Jan 27 00:07:00 crc systemd[1]: Starting Kubernetes Kubelet... Jan 27 00:07:00 crc restorecon[4692]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 27 00:07:00 
crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 
00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc 
restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 
crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 
crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 
00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc 
restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 
00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 
00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc 
restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:01 crc restorecon[4692]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 27 00:07:02 crc kubenswrapper[4774]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 00:07:02 crc kubenswrapper[4774]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 27 00:07:02 crc kubenswrapper[4774]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 00:07:02 crc kubenswrapper[4774]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
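[editor's note] The long run of restorecon messages above lists paths that were "not reset as customized by admin": when restorecon runs without forcing, it leaves files alone whose current SELinux type (here container_file_t, carrying per-pod MCS categories such as s0:c7,c13) is treated as an admin customization instead of resetting them to the policy default, while files like /var/lib/kubelet/config.json and /var/usrlocal/bin/kubenswrapper were actually relabeled. The Go sketch below is only a rough, hedged illustration of the check involved — it reads a file's current label from the security.selinux xattr and compares the type field against an assumed "customizable" set; the set and the decision logic are illustrative assumptions, not the real restorecon/libselinux implementation.

package main

import (
	"fmt"
	"os"
	"strings"

	"golang.org/x/sys/unix"
)

// customizableTypes is an assumed stand-in for the policy's customizable types;
// real restorecon consults the loaded SELinux policy, not a hard-coded set.
var customizableTypes = map[string]bool{
	"container_file_t": true,
}

// currentLabel returns the SELinux context stored in the security.selinux xattr,
// e.g. "system_u:object_r:container_file_t:s0:c7,c13".
func currentLabel(path string) (string, error) {
	buf := make([]byte, 256)
	n, err := unix.Getxattr(path, "security.selinux", buf)
	if err != nil {
		return "", err
	}
	return strings.TrimRight(string(buf[:n]), "\x00"), nil
}

func main() {
	if len(os.Args) < 2 {
		fmt.Fprintln(os.Stderr, "usage: selcheck <path>")
		os.Exit(2)
	}
	path := os.Args[1]
	label, err := currentLabel(path)
	if err != nil {
		fmt.Fprintln(os.Stderr, "getxattr:", err)
		os.Exit(1)
	}
	// A context is user:role:type:level; the level may itself contain colons
	// (s0:c7,c13), so split into at most four fields and take the third.
	fields := strings.SplitN(label, ":", 4)
	if len(fields) >= 3 && customizableTypes[fields[2]] {
		fmt.Printf("%s not reset (type %s treated as customized)\n", path, fields[2])
	} else {
		fmt.Printf("%s would be relabeled to the policy default\n", path)
	}
}

Pointed at one of the kubelet pod paths above (as root), this would report the same "not reset" outcome the log shows for container_file_t files.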
Jan 27 00:07:02 crc kubenswrapper[4774]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 27 00:07:02 crc kubenswrapper[4774]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.103461 4774 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108783 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108815 4774 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108825 4774 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108836 4774 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108845 4774 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108855 4774 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108869 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108904 4774 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108913 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108922 4774 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108930 4774 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108938 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108946 4774 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108954 4774 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108961 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108969 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108977 4774 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108985 4774 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109024 4774 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
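[editor's note] The kubenswrapper (kubelet) startup messages above mark several command-line options (--container-runtime-endpoint, --volume-plugin-dir, --register-with-taints, --system-reserved, and others) as deprecated in favour of the file passed via --config. As a hedged sketch only: that file is a KubeletConfiguration object (apiVersion kubelet.config.k8s.io/v1beta1), and the camelCase field names used below (containerRuntimeEndpoint, volumePluginDir, systemReserved) are recalled from the KubeletConfiguration reference rather than taken from this log, as are the example values; the snippet simply prints a minimal JSON rendering of that shape.

package main

import (
	"encoding/json"
	"fmt"
)

// kubeletConfig is a minimal, assumed shape of a kubelet config file; field
// names mirror the KubeletConfiguration serialization as recalled, not a
// vendored upstream type.
type kubeletConfig struct {
	APIVersion               string            `json:"apiVersion"`
	Kind                     string            `json:"kind"`
	ContainerRuntimeEndpoint string            `json:"containerRuntimeEndpoint,omitempty"`
	VolumePluginDir          string            `json:"volumePluginDir,omitempty"`
	SystemReserved           map[string]string `json:"systemReserved,omitempty"`
}

func main() {
	cfg := kubeletConfig{
		APIVersion:               "kubelet.config.k8s.io/v1beta1",
		Kind:                     "KubeletConfiguration",
		ContainerRuntimeEndpoint: "unix:///var/run/crio/crio.sock", // hypothetical endpoint
		VolumePluginDir:          "/etc/kubernetes/kubelet-plugins/volume/exec",
		SystemReserved:           map[string]string{"cpu": "500m", "memory": "1Gi"},
	}
	out, _ := json.MarshalIndent(cfg, "", "  ")
	fmt.Println(string(out))
}

The point of the deprecation notices is that settings expressed this way travel with the config file rather than the unit's command line, which is why the log keeps pointing at the kubelet-config-file documentation URL.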
Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109035 4774 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109045 4774 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109053 4774 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109062 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109071 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109080 4774 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109089 4774 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109096 4774 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109106 4774 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109114 4774 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109124 4774 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109136 4774 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109147 4774 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109157 4774 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109165 4774 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109173 4774 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109181 4774 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109189 4774 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109197 4774 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109206 4774 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109214 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109225 4774 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109236 4774 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109245 4774 feature_gate.go:330] unrecognized feature gate: Example Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109253 4774 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109261 4774 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109269 4774 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109277 4774 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109284 4774 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109292 4774 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109299 4774 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109307 4774 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109316 4774 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109325 4774 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109332 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109340 4774 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109347 4774 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109355 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109365 4774 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109373 4774 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109380 4774 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109387 4774 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109395 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109403 4774 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109411 4774 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109418 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109426 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109435 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud 
Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109443 4774 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109451 4774 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109459 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109466 4774 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109607 4774 flags.go:64] FLAG: --address="0.0.0.0" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109623 4774 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109638 4774 flags.go:64] FLAG: --anonymous-auth="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109650 4774 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109662 4774 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109672 4774 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109685 4774 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109696 4774 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109706 4774 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109715 4774 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109725 4774 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109735 4774 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109745 4774 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109754 4774 flags.go:64] FLAG: --cgroup-root="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109762 4774 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109771 4774 flags.go:64] FLAG: --client-ca-file="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109780 4774 flags.go:64] FLAG: --cloud-config="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109789 4774 flags.go:64] FLAG: --cloud-provider="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109798 4774 flags.go:64] FLAG: --cluster-dns="[]" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109808 4774 flags.go:64] FLAG: --cluster-domain="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109817 4774 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109826 4774 flags.go:64] FLAG: --config-dir="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109835 4774 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109845 4774 flags.go:64] FLAG: --container-log-max-files="5" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109862 4774 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 27 
00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109871 4774 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109912 4774 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109922 4774 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109932 4774 flags.go:64] FLAG: --contention-profiling="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109964 4774 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109974 4774 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109985 4774 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109994 4774 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110005 4774 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110014 4774 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110023 4774 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110032 4774 flags.go:64] FLAG: --enable-load-reader="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110041 4774 flags.go:64] FLAG: --enable-server="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110050 4774 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110062 4774 flags.go:64] FLAG: --event-burst="100" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110071 4774 flags.go:64] FLAG: --event-qps="50" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110080 4774 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110090 4774 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110099 4774 flags.go:64] FLAG: --eviction-hard="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110110 4774 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110119 4774 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110128 4774 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110137 4774 flags.go:64] FLAG: --eviction-soft="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110146 4774 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110155 4774 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110164 4774 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110173 4774 flags.go:64] FLAG: --experimental-mounter-path="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110182 4774 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110191 4774 flags.go:64] FLAG: --fail-swap-on="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110200 4774 flags.go:64] FLAG: 
--feature-gates="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110210 4774 flags.go:64] FLAG: --file-check-frequency="20s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110220 4774 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110229 4774 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110239 4774 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110249 4774 flags.go:64] FLAG: --healthz-port="10248" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110258 4774 flags.go:64] FLAG: --help="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110268 4774 flags.go:64] FLAG: --hostname-override="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110277 4774 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110286 4774 flags.go:64] FLAG: --http-check-frequency="20s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110295 4774 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110304 4774 flags.go:64] FLAG: --image-credential-provider-config="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110313 4774 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110323 4774 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110332 4774 flags.go:64] FLAG: --image-service-endpoint="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110340 4774 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110351 4774 flags.go:64] FLAG: --kube-api-burst="100" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110360 4774 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110370 4774 flags.go:64] FLAG: --kube-api-qps="50" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110378 4774 flags.go:64] FLAG: --kube-reserved="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110387 4774 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110396 4774 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110405 4774 flags.go:64] FLAG: --kubelet-cgroups="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110414 4774 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110422 4774 flags.go:64] FLAG: --lock-file="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110431 4774 flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110440 4774 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110449 4774 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110462 4774 flags.go:64] FLAG: --log-json-split-stream="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110471 4774 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110480 4774 flags.go:64] FLAG: --log-text-split-stream="false" Jan 27 
00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110489 4774 flags.go:64] FLAG: --logging-format="text" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110498 4774 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110508 4774 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110516 4774 flags.go:64] FLAG: --manifest-url="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110525 4774 flags.go:64] FLAG: --manifest-url-header="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110536 4774 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110546 4774 flags.go:64] FLAG: --max-open-files="1000000" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110557 4774 flags.go:64] FLAG: --max-pods="110" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110566 4774 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110575 4774 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110584 4774 flags.go:64] FLAG: --memory-manager-policy="None" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110593 4774 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110602 4774 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110612 4774 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110621 4774 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110641 4774 flags.go:64] FLAG: --node-status-max-images="50" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110650 4774 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110660 4774 flags.go:64] FLAG: --oom-score-adj="-999" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110670 4774 flags.go:64] FLAG: --pod-cidr="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110679 4774 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110693 4774 flags.go:64] FLAG: --pod-manifest-path="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110702 4774 flags.go:64] FLAG: --pod-max-pids="-1" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110712 4774 flags.go:64] FLAG: --pods-per-core="0" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110721 4774 flags.go:64] FLAG: --port="10250" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110730 4774 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110739 4774 flags.go:64] FLAG: --provider-id="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110749 4774 flags.go:64] FLAG: --qos-reserved="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110758 4774 flags.go:64] FLAG: --read-only-port="10255" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110767 4774 flags.go:64] FLAG: --register-node="true" Jan 27 00:07:02 crc 
kubenswrapper[4774]: I0127 00:07:02.110776 4774 flags.go:64] FLAG: --register-schedulable="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110787 4774 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110804 4774 flags.go:64] FLAG: --registry-burst="10" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110813 4774 flags.go:64] FLAG: --registry-qps="5" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110823 4774 flags.go:64] FLAG: --reserved-cpus="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110832 4774 flags.go:64] FLAG: --reserved-memory="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110843 4774 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110853 4774 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110868 4774 flags.go:64] FLAG: --rotate-certificates="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110901 4774 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110911 4774 flags.go:64] FLAG: --runonce="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110920 4774 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110930 4774 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110939 4774 flags.go:64] FLAG: --seccomp-default="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110949 4774 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110958 4774 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110967 4774 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110976 4774 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110986 4774 flags.go:64] FLAG: --storage-driver-password="root" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110994 4774 flags.go:64] FLAG: --storage-driver-secure="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111003 4774 flags.go:64] FLAG: --storage-driver-table="stats" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111012 4774 flags.go:64] FLAG: --storage-driver-user="root" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111021 4774 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111030 4774 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111039 4774 flags.go:64] FLAG: --system-cgroups="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111049 4774 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111062 4774 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111071 4774 flags.go:64] FLAG: --tls-cert-file="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111080 4774 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111090 4774 flags.go:64] FLAG: --tls-min-version="" Jan 27 00:07:02 
crc kubenswrapper[4774]: I0127 00:07:02.111099 4774 flags.go:64] FLAG: --tls-private-key-file="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111108 4774 flags.go:64] FLAG: --topology-manager-policy="none" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111117 4774 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111127 4774 flags.go:64] FLAG: --topology-manager-scope="container" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111135 4774 flags.go:64] FLAG: --v="2" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111147 4774 flags.go:64] FLAG: --version="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111158 4774 flags.go:64] FLAG: --vmodule="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111169 4774 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111178 4774 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111413 4774 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111426 4774 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111435 4774 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111443 4774 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111453 4774 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111462 4774 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111472 4774 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111480 4774 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111488 4774 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111496 4774 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111505 4774 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111513 4774 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111521 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111529 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111537 4774 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111547 4774 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111557 4774 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111565 4774 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111573 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111581 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111589 4774 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111597 4774 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111605 4774 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111617 4774 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111627 4774 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111636 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111646 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111656 4774 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111664 4774 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111673 4774 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111682 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111690 4774 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111698 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111708 4774 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111716 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111724 4774 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111732 4774 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111740 4774 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111748 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111756 4774 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111764 4774 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 
00:07:02.111772 4774 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111781 4774 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111789 4774 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111796 4774 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111804 4774 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111812 4774 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111820 4774 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111830 4774 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111840 4774 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111847 4774 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111862 4774 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111870 4774 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111900 4774 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111908 4774 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111916 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111924 4774 feature_gate.go:330] unrecognized feature gate: Example Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111932 4774 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111942 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111953 4774 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111963 4774 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111972 4774 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111981 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111989 4774 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111998 4774 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.112006 4774 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.112013 4774 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.112021 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.112029 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.112036 4774 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.112044 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.112057 4774 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.121143 4774 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.121180 4774 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122180 4774 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122450 4774 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122466 4774 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122484 4774 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
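[Editor's note, not part of the log] The "unrecognized feature gate" warnings above come from passing the full OpenShift gate set to a kubelet that only registers the Kubernetes gates; the consolidated "feature gates: {map[...]}" line shows which names were actually applied. The sketch below is an assumption-laden illustration of that registration/apply flow using k8s.io/component-base/featuregate: gate names are taken from the log, and whether an unknown name produces a warning (as at feature_gate.go:330 here) or an error depends on the build — the upstream library rejects it with an error.

// Illustrative sketch only: a registry accepts values for registered gates and
// rejects names it does not know about.
package main

import (
	"fmt"

	"k8s.io/component-base/featuregate"
)

const routeAdvertisements featuregate.Feature = "RouteAdvertisements"

func main() {
	gates := featuregate.NewFeatureGate()

	// Only gates that have been registered can be set.
	if err := gates.Add(map[featuregate.Feature]featuregate.FeatureSpec{
		routeAdvertisements: {Default: false, PreRelease: featuregate.Alpha},
	}); err != nil {
		panic(err)
	}

	// A registered gate is accepted and takes effect.
	if err := gates.SetFromMap(map[string]bool{"RouteAdvertisements": true}); err != nil {
		fmt.Println("set failed:", err)
	}
	fmt.Println("RouteAdvertisements enabled:", gates.Enabled(routeAdvertisements))

	// An unregistered name is rejected by the upstream library
	// (the kubelet in this log instead downgrades this to a warning).
	if err := gates.SetFromMap(map[string]bool{"PlatformOperators": true}); err != nil {
		fmt.Println("set failed:", err)
	}
}

The log resumes below with the same warning block repeated for the next feature-gate parse.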
Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122497 4774 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122509 4774 feature_gate.go:330] unrecognized feature gate: Example Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122521 4774 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122533 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122546 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122557 4774 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122787 4774 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122803 4774 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122814 4774 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122824 4774 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122836 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122849 4774 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122897 4774 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123152 4774 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123166 4774 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123175 4774 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123186 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123197 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123206 4774 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123215 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123658 4774 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123671 4774 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123680 4774 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123693 4774 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123708 4774 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123717 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123726 4774 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123734 4774 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123743 4774 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123751 4774 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123759 4774 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123770 4774 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123778 4774 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123787 4774 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123795 4774 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123803 4774 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123814 4774 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123824 4774 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123833 4774 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123843 4774 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123854 4774 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123868 4774 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123900 4774 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123910 4774 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123918 4774 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123926 4774 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123934 4774 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123942 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123949 4774 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123957 4774 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123965 4774 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123974 4774 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123981 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123991 4774 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124002 4774 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124011 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124020 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124028 4774 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124038 4774 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124046 4774 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124055 4774 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124063 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124071 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124080 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124088 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124096 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124105 4774 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.124119 4774 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124386 4774 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124401 4774 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124410 4774 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124419 4774 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124428 4774 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124436 4774 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124444 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124453 4774 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 
00:07:02.124461 4774 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124469 4774 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124477 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124485 4774 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124493 4774 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124502 4774 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124510 4774 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124519 4774 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124527 4774 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124535 4774 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124547 4774 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124560 4774 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124571 4774 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124580 4774 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124589 4774 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124601 4774 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124611 4774 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124623 4774 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124634 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124646 4774 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124657 4774 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124668 4774 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124679 4774 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124690 4774 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124701 4774 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124712 4774 feature_gate.go:330] unrecognized 
feature gate: BootcNodeManagement Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124722 4774 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124732 4774 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124742 4774 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124751 4774 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124759 4774 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124767 4774 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124775 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124782 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124790 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124797 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124805 4774 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124813 4774 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124821 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124829 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124837 4774 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124844 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124852 4774 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124867 4774 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124874 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124906 4774 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124914 4774 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124924 4774 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124934 4774 feature_gate.go:330] unrecognized feature gate: Example Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124942 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124951 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124960 4774 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124970 4774 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124979 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124988 4774 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124996 4774 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.125004 4774 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.125014 4774 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.125024 4774 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.125033 4774 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.125041 4774 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.125049 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.125057 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.125069 4774 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.125965 4774 server.go:940] "Client rotation is on, will bootstrap in background" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.135150 4774 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.135592 4774 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
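[Editor's note, not part of the log] The entries just above show client certificate rotation loading /var/lib/kubelet/pki/kubelet-client-current.pem, and the entries just below derive a rotation deadline from that certificate's expiration. As a manual cross-check, the standard-library sketch below prints the validity window of each certificate in that bundle; the path and the idea of running it directly on the node are assumptions for illustration, not a documented procedure from this log.

// Illustrative sketch only: print NotBefore/NotAfter for the kubelet client
// certificate bundle referenced in the preceding log entry.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
)

func main() {
	data, err := os.ReadFile("/var/lib/kubelet/pki/kubelet-client-current.pem")
	if err != nil {
		panic(err)
	}
	for {
		var block *pem.Block
		block, data = pem.Decode(data)
		if block == nil {
			break
		}
		if block.Type != "CERTIFICATE" {
			continue // the bundle also contains the private key block
		}
		cert, err := x509.ParseCertificate(block.Bytes)
		if err != nil {
			panic(err)
		}
		fmt.Printf("subject=%s notBefore=%s notAfter=%s\n",
			cert.Subject, cert.NotBefore, cert.NotAfter)
	}
}

The log resumes below with the certificate rotation entries and CRI runtime validation.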
Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.137617 4774 server.go:997] "Starting client certificate rotation" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.137675 4774 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.138889 4774 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-13 07:12:00.938935368 +0000 UTC Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.139039 4774 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.162845 4774 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.165701 4774 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.166092 4774 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.183437 4774 log.go:25] "Validated CRI v1 runtime API" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.224720 4774 log.go:25] "Validated CRI v1 image API" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.229448 4774 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.234247 4774 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-27-00-01-12-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.234321 4774 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.270207 4774 manager.go:217] Machine: {Timestamp:2026-01-27 00:07:02.265728101 +0000 UTC m=+0.571505055 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:e18b2370-db20-4c66-88f9-fff4652ef035 BootID:798433ee-0aed-45e3-8b2f-39b7bf5cbb06 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 
Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:af:4d:f9 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:af:4d:f9 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:07:29:13 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:b4:fd:3e Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:7b:50:df Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ea:e7:8d Speed:-1 Mtu:1496} {Name:eth10 MacAddress:4e:34:06:c0:b8:79 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:d6:7a:be:f5:90:45 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] 
Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.270606 4774 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.270818 4774 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.273237 4774 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.273647 4774 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.273714 4774 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.274221 4774 topology_manager.go:138] "Creating topology manager with none policy" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.274242 4774 
container_manager_linux.go:303] "Creating device plugin manager" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.274771 4774 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.274827 4774 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.275158 4774 state_mem.go:36] "Initialized new in-memory state store" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.276071 4774 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.280502 4774 kubelet.go:418] "Attempting to sync node with API server" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.280550 4774 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.280606 4774 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.280634 4774 kubelet.go:324] "Adding apiserver pod source" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.280657 4774 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.285024 4774 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.285100 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.285167 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.285490 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.285557 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.285951 4774 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.288145 4774 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.289919 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.289942 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.289951 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.289960 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.289975 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.289985 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.289994 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.290006 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.290014 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.290023 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.290032 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.290038 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.292113 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.292458 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.292911 4774 server.go:1280] "Started kubelet" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.294086 4774 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.294121 4774 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.294681 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.294705 4774 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.294715 4774 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.294742 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 08:53:47.390547811 +0000 UTC Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.294841 4774 volume_manager.go:287] "The 
desired_state_of_world populator starts" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.294866 4774 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.295022 4774 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 27 00:07:02 crc systemd[1]: Started Kubernetes Kubelet. Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.294982 4774 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.295824 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.295968 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.297176 4774 factory.go:55] Registering systemd factory Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.297222 4774 factory.go:221] Registration of the systemd container factory successfully Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.297539 4774 factory.go:153] Registering CRI-O factory Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.297581 4774 factory.go:221] Registration of the crio container factory successfully Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.297670 4774 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.297697 4774 factory.go:103] Registering Raw factory Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.297714 4774 manager.go:1196] Started watching for new ooms in manager Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.297674 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="200ms" Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.306704 4774 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.27:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188e6dcb029d1889 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 00:07:02.292822153 +0000 UTC m=+0.598599077,LastTimestamp:2026-01-27 00:07:02.292822153 +0000 UTC m=+0.598599077,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 
00:07:02.312490 4774 manager.go:319] Starting recovery of all containers Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.312647 4774 server.go:460] "Adding debug handlers to kubelet server" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321348 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321412 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321486 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321518 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321533 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321548 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321563 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321581 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321600 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321616 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321633 4774 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321651 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321667 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321685 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321701 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321719 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322102 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322121 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322136 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322151 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322169 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322186 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322201 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322257 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322287 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322336 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322355 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322372 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322387 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322402 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322418 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322436 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322452 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322468 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322482 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322499 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322537 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322553 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322569 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322585 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322601 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322618 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322635 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322652 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322672 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322690 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322707 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322729 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322746 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322763 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322783 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322803 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322826 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322846 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322990 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323019 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323039 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323058 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323075 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323092 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323109 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323127 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323178 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323196 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323211 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323227 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323275 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323292 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323309 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323324 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323370 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323416 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323800 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323828 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323847 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323888 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323908 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323926 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.324937 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325049 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325084 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325114 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325145 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325172 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325202 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325240 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325272 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325302 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325333 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325360 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325387 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325416 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325445 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325473 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325501 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325533 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325560 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325590 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325617 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325647 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325678 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325708 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325734 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325763 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325812 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325847 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325932 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325968 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325998 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326028 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326059 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326093 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326124 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326152 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326182 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326211 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326239 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326270 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326346 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326376 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326408 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326437 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326463 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326492 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326518 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326544 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326572 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326606 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326635 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326660 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326691 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326718 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326749 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326778 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326808 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326841 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326903 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326935 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326971 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327014 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327042 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327068 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327097 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327131 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327158 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327184 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327212 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327238 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327267 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327294 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327321 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327349 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327376 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327402 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327428 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327457 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327487 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327565 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327598 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327630 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327654 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327683 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327709 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327734 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327773 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327798 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327826 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327851 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327914 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327944 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327973 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328000 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328024 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328049 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328076 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328110 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328134 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328164 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328191 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328222 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328249 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328271 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328295 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328318 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328344 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328369 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328395 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328418 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328445 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328472 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328500 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328530 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328554 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328583 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328610 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328634 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328664 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328691 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328716 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328745 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328775 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328803 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328830 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328854 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.331153 4774 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.331209 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.331244 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.331277 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.331302 4774 reconstruct.go:97] "Volume reconstruction 
finished" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.331318 4774 reconciler.go:26] "Reconciler: start to sync state" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.334361 4774 manager.go:324] Recovery completed Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.347790 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.352285 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.352333 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.352358 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.353278 4774 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.353736 4774 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.353774 4774 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.353809 4774 state_mem.go:36] "Initialized new in-memory state store" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.355259 4774 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.355314 4774 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.355352 4774 kubelet.go:2335] "Starting kubelet main sync loop" Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.355408 4774 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.359104 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.359192 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.370071 4774 policy_none.go:49] "None policy: Start" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.371560 4774 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.371586 4774 state_mem.go:35] "Initializing new in-memory state store" Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.395379 4774 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.417109 4774 manager.go:334] "Starting Device Plugin manager" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.417155 4774 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.417167 4774 server.go:79] "Starting device plugin registration server" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.417553 4774 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.417569 4774 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.419385 4774 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.419512 4774 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.419521 4774 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.427434 4774 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.456224 4774 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.456326 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.457464 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.457520 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.457535 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.457725 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.457852 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.457904 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.458664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.458692 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.458702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.458806 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.458815 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.458897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.458918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.458954 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.458994 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.459404 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.459430 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.459443 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.459558 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.459683 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.459727 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.459957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.459988 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.460000 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.460312 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.460343 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.460379 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.460526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.460546 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.460553 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.460582 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.460726 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.460783 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.461460 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.461478 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.461488 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.461596 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.461619 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.461783 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.461808 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.461817 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.462314 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.462352 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.462363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.508242 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="400ms" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.518711 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.519684 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.519732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.519744 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.519776 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.520244 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.27:6443: connect: connection refused" node="crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534198 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534281 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534320 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534390 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534430 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534461 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534483 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534507 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534531 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534551 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534571 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534593 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534614 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534697 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534730 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636133 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636176 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636242 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636266 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636285 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636313 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636332 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636358 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636401 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636413 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636463 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636413 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636486 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636561 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636573 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636413 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 
00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636586 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636601 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636432 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636619 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636463 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636646 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636695 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636671 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636717 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636756 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636756 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636816 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636773 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636971 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.720771 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.722016 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.722048 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.722056 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.722076 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.722505 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.27:6443: connect: connection refused" node="crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.784379 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.789905 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.796717 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.820462 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.827319 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.842112 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-344b8c1ba8721614715bb49ec8fdf2492938ad911ba5b32de4fcab529b1a5477 WatchSource:0}: Error finding container 344b8c1ba8721614715bb49ec8fdf2492938ad911ba5b32de4fcab529b1a5477: Status 404 returned error can't find the container with id 344b8c1ba8721614715bb49ec8fdf2492938ad911ba5b32de4fcab529b1a5477 Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.844514 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-13b6370963b2822656a64f9f14e054da3bebc442e405ee4128d931f924da78a2 WatchSource:0}: Error finding container 13b6370963b2822656a64f9f14e054da3bebc442e405ee4128d931f924da78a2: Status 404 returned error can't find the container with id 13b6370963b2822656a64f9f14e054da3bebc442e405ee4128d931f924da78a2 Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.849131 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-9584501c7473a433e1a66a1dccc481a2441c33db7652822a4bda4a47fec5031c WatchSource:0}: Error finding container 9584501c7473a433e1a66a1dccc481a2441c33db7652822a4bda4a47fec5031c: Status 404 returned error can't find the container with id 9584501c7473a433e1a66a1dccc481a2441c33db7652822a4bda4a47fec5031c Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.856699 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-24ee2ebed2f311488e856ce85d820e679b7f17dc7ddb525aabb3210abda7f946 WatchSource:0}: Error finding container 24ee2ebed2f311488e856ce85d820e679b7f17dc7ddb525aabb3210abda7f946: Status 404 returned error can't find the container with id 24ee2ebed2f311488e856ce85d820e679b7f17dc7ddb525aabb3210abda7f946 Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.862464 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-315500a735b1da0e9ee70be9654654f384727f76c19428e6340f388ed76f6fd8 WatchSource:0}: Error finding container 315500a735b1da0e9ee70be9654654f384727f76c19428e6340f388ed76f6fd8: Status 404 returned error can't find the container with id 315500a735b1da0e9ee70be9654654f384727f76c19428e6340f388ed76f6fd8 Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.909332 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="800ms" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.123555 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.126677 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.126722 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.126732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.126759 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 00:07:03 crc kubenswrapper[4774]: E0127 00:07:03.127231 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.27:6443: connect: connection refused" node="crc" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.293245 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.295281 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 13:37:37.237960573 +0000 UTC Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.359529 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"24ee2ebed2f311488e856ce85d820e679b7f17dc7ddb525aabb3210abda7f946"} Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.360414 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9584501c7473a433e1a66a1dccc481a2441c33db7652822a4bda4a47fec5031c"} Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.361947 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"344b8c1ba8721614715bb49ec8fdf2492938ad911ba5b32de4fcab529b1a5477"} Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.362957 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"13b6370963b2822656a64f9f14e054da3bebc442e405ee4128d931f924da78a2"} Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.365093 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"315500a735b1da0e9ee70be9654654f384727f76c19428e6340f388ed76f6fd8"} Jan 27 00:07:03 crc kubenswrapper[4774]: W0127 00:07:03.366730 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:03 crc kubenswrapper[4774]: E0127 00:07:03.366803 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:03 crc kubenswrapper[4774]: W0127 00:07:03.474309 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:03 crc kubenswrapper[4774]: E0127 00:07:03.474404 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:03 crc kubenswrapper[4774]: E0127 00:07:03.709929 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="1.6s" Jan 27 00:07:03 crc kubenswrapper[4774]: W0127 00:07:03.832421 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:03 crc kubenswrapper[4774]: E0127 00:07:03.832527 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.927343 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.928989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.929025 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.929037 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.929065 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 00:07:03 crc kubenswrapper[4774]: E0127 00:07:03.929607 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.27:6443: connect: connection refused" node="crc" Jan 27 00:07:03 crc kubenswrapper[4774]: W0127 00:07:03.936045 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:03 crc kubenswrapper[4774]: E0127 00:07:03.936142 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to 
watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:04 crc kubenswrapper[4774]: E0127 00:07:04.196598 4774 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.27:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188e6dcb029d1889 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 00:07:02.292822153 +0000 UTC m=+0.598599077,LastTimestamp:2026-01-27 00:07:02.292822153 +0000 UTC m=+0.598599077,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.293765 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.295802 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 20:58:05.587295729 +0000 UTC Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.366412 4774 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 27 00:07:04 crc kubenswrapper[4774]: E0127 00:07:04.367331 4774 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.369833 4774 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283" exitCode=0 Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.370096 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.372825 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283"} Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.376897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.376935 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.376947 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 
00:07:04.380370 4774 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="1aaa7b231202686b65f633283644052e65287917c7621e708f036c1a7278863b" exitCode=0 Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.380455 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"1aaa7b231202686b65f633283644052e65287917c7621e708f036c1a7278863b"} Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.380581 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.384190 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.384214 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.384224 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.385698 4774 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537" exitCode=0 Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.385819 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.386412 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537"} Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.386994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.387027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.387037 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.390447 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b"} Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.390490 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472"} Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.390506 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32"} Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.390520 4774 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78"} Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.390614 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.391973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.392038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.392064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.407677 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea" exitCode=0 Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.409302 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.409903 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea"} Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.413447 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.413481 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.413494 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.415353 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.416297 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.416368 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.416388 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.293760 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.296001 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 17:15:04.561834927 +0000 UTC Jan 27 00:07:05 crc kubenswrapper[4774]: E0127 00:07:05.310585 4774 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="3.2s" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.416056 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d0e4428feac8787d889cfa23715983ac112bac861e3e00ab7ae19ec676964eb4"} Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.416119 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.417419 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.417449 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.417460 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.424899 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.425054 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069"} Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.425143 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380"} Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.425166 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff"} Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.425808 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.425834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.425842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.429035 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5ebbc17f79533614a14a0f1e296512fcd303858d2c2f0b4b4e925c54238af8c7"} Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.429080 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba"} Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 
00:07:05.429101 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d"} Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.429119 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203"} Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.429137 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98"} Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.429273 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.430210 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.430248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.430265 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.432767 4774 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8" exitCode=0 Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.432814 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8"} Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.432879 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.433047 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.433747 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.433779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.433790 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.434526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.434548 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.434564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.463065 
4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.537911 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.538993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.539030 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.539042 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.539069 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 00:07:05 crc kubenswrapper[4774]: E0127 00:07:05.539474 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.27:6443: connect: connection refused" node="crc" Jan 27 00:07:05 crc kubenswrapper[4774]: W0127 00:07:05.580368 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:05 crc kubenswrapper[4774]: E0127 00:07:05.580438 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.296573 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 03:51:15.368636641 +0000 UTC Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.440025 4774 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d" exitCode=0 Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.440257 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.440334 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.440404 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.440440 4774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.440475 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.440340 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.440999 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d"} Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.441093 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442247 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442292 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442285 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442338 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442344 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442354 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442364 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442310 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442314 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442588 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442614 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442843 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442972 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.700650 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.297165 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 07:06:57.386613272 +0000 UTC Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.445892 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf"} Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.445941 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883"} Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.445951 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202"} Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.445961 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0"} Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.445969 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd"} Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.445989 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.445999 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.445999 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.447391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.447431 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.447437 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.447444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.447494 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.447512 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.447455 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.447554 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.447440 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.798336 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.254146 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.254400 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.256210 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.256259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.256272 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.294287 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.298351 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 06:56:27.091026964 +0000 UTC Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.410350 4774 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.449098 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.449145 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.449195 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.450916 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.450975 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.450920 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.451029 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.451052 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.451002 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.451906 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.451971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.452011 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.730659 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.740354 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.742126 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.742163 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.742176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.742201 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.805749 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:09 crc kubenswrapper[4774]: I0127 00:07:09.298590 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 21:18:23.695541619 +0000 UTC Jan 27 00:07:09 crc kubenswrapper[4774]: I0127 00:07:09.451155 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:09 crc kubenswrapper[4774]: I0127 00:07:09.451345 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:09 crc kubenswrapper[4774]: I0127 00:07:09.452114 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:09 crc kubenswrapper[4774]: I0127 00:07:09.452166 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:09 crc kubenswrapper[4774]: I0127 00:07:09.452186 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:09 crc kubenswrapper[4774]: I0127 00:07:09.452512 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:09 crc kubenswrapper[4774]: I0127 00:07:09.452564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:09 crc kubenswrapper[4774]: I0127 00:07:09.452583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:10 crc kubenswrapper[4774]: I0127 00:07:10.299367 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 11:42:04.254806817 +0000 UTC Jan 27 00:07:11 crc kubenswrapper[4774]: I0127 00:07:11.299827 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 00:13:25.59936718 +0000 UTC Jan 27 00:07:11 crc kubenswrapper[4774]: I0127 00:07:11.806429 4774 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 27 00:07:11 crc kubenswrapper[4774]: I0127 00:07:11.806574 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.300532 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 00:41:09.599863086 +0000 UTC Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.339999 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.340243 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.341786 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.341850 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.341896 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.359930 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 27 00:07:12 crc kubenswrapper[4774]: E0127 00:07:12.427590 4774 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.461534 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.463281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.463348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.463372 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.301104 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 09:15:22.562955207 +0000 UTC Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.381046 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.381376 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.383797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.384172 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.384359 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.392118 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.463654 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.465040 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.465116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.465141 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:14 crc kubenswrapper[4774]: I0127 00:07:14.301388 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 16:31:54.600609977 +0000 UTC Jan 27 00:07:15 crc kubenswrapper[4774]: I0127 00:07:15.301949 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 02:02:45.967839176 +0000 UTC Jan 27 00:07:15 crc kubenswrapper[4774]: W0127 00:07:15.839765 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 27 00:07:15 crc kubenswrapper[4774]: I0127 00:07:15.839938 4774 trace.go:236] Trace[238075526]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Jan-2026 00:07:05.838) (total time: 10001ms): Jan 27 00:07:15 crc kubenswrapper[4774]: Trace[238075526]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:07:15.839) Jan 27 00:07:15 crc kubenswrapper[4774]: Trace[238075526]: [10.001749827s] [10.001749827s] END Jan 27 00:07:15 crc kubenswrapper[4774]: E0127 00:07:15.839971 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.295535 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.302677 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 05:48:26.66934062 +0000 UTC Jan 27 00:07:16 crc kubenswrapper[4774]: W0127 
00:07:16.317043 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.317179 4774 trace.go:236] Trace[860708573]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Jan-2026 00:07:06.315) (total time: 10002ms): Jan 27 00:07:16 crc kubenswrapper[4774]: Trace[860708573]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:07:16.317) Jan 27 00:07:16 crc kubenswrapper[4774]: Trace[860708573]: [10.002120217s] [10.002120217s] END Jan 27 00:07:16 crc kubenswrapper[4774]: E0127 00:07:16.317296 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 27 00:07:16 crc kubenswrapper[4774]: W0127 00:07:16.411944 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.412363 4774 trace.go:236] Trace[728287144]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Jan-2026 00:07:06.410) (total time: 10002ms): Jan 27 00:07:16 crc kubenswrapper[4774]: Trace[728287144]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:07:16.411) Jan 27 00:07:16 crc kubenswrapper[4774]: Trace[728287144]: [10.002118508s] [10.002118508s] END Jan 27 00:07:16 crc kubenswrapper[4774]: E0127 00:07:16.412525 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.474554 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.476983 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5ebbc17f79533614a14a0f1e296512fcd303858d2c2f0b4b4e925c54238af8c7" exitCode=255 Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.477031 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5ebbc17f79533614a14a0f1e296512fcd303858d2c2f0b4b4e925c54238af8c7"} Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.477179 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.477975 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.478007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.478017 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.478476 4774 scope.go:117] "RemoveContainer" containerID="5ebbc17f79533614a14a0f1e296512fcd303858d2c2f0b4b4e925c54238af8c7" Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.153807 4774 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.153930 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.157641 4774 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.157701 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.303180 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 04:04:54.521765215 +0000 UTC Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.482112 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.485332 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c"} Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.485603 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.487004 4774 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.487061 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.487081 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:18 crc kubenswrapper[4774]: I0127 00:07:18.303903 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 07:42:40.401504064 +0000 UTC Jan 27 00:07:18 crc kubenswrapper[4774]: I0127 00:07:18.740367 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:18 crc kubenswrapper[4774]: I0127 00:07:18.740574 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:18 crc kubenswrapper[4774]: I0127 00:07:18.740760 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:18 crc kubenswrapper[4774]: I0127 00:07:18.742111 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:18 crc kubenswrapper[4774]: I0127 00:07:18.742155 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:18 crc kubenswrapper[4774]: I0127 00:07:18.742170 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:18 crc kubenswrapper[4774]: I0127 00:07:18.747285 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:19 crc kubenswrapper[4774]: I0127 00:07:19.304022 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 03:54:04.515640344 +0000 UTC Jan 27 00:07:19 crc kubenswrapper[4774]: I0127 00:07:19.493133 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:19 crc kubenswrapper[4774]: I0127 00:07:19.494577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:19 crc kubenswrapper[4774]: I0127 00:07:19.494623 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:19 crc kubenswrapper[4774]: I0127 00:07:19.494635 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:20 crc kubenswrapper[4774]: I0127 00:07:20.304353 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 18:17:17.12335762 +0000 UTC Jan 27 00:07:20 crc kubenswrapper[4774]: I0127 00:07:20.495504 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:20 crc kubenswrapper[4774]: I0127 00:07:20.496229 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:20 crc kubenswrapper[4774]: I0127 00:07:20.496256 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:07:20 crc kubenswrapper[4774]: I0127 00:07:20.496265 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:21 crc kubenswrapper[4774]: I0127 00:07:21.304751 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 14:07:34.148355546 +0000 UTC Jan 27 00:07:21 crc kubenswrapper[4774]: I0127 00:07:21.806623 4774 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 27 00:07:21 crc kubenswrapper[4774]: I0127 00:07:21.806710 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 27 00:07:21 crc kubenswrapper[4774]: I0127 00:07:21.819083 4774 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.150570 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.154301 4774 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.157544 4774 trace.go:236] Trace[594448485]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Jan-2026 00:07:11.346) (total time: 10811ms): Jan 27 00:07:22 crc kubenswrapper[4774]: Trace[594448485]: ---"Objects listed" error: 10811ms (00:07:22.157) Jan 27 00:07:22 crc kubenswrapper[4774]: Trace[594448485]: [10.811328741s] [10.811328741s] END Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.157575 4774 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.161402 4774 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.163679 4774 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.163782 4774 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.163801 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.168398 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.168435 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:22 crc 
kubenswrapper[4774]: I0127 00:07:22.168444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.168462 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.168472 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:22Z","lastTransitionTime":"2026-01-27T00:07:22Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.181603 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.185432 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.185465 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 
00:07:22.185480 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.185504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.185520 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:22Z","lastTransitionTime":"2026-01-27T00:07:22Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.187571 4774 csr.go:261] certificate signing request csr-mj7kx is approved, waiting to be issued Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.196047 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.196754 4774 csr.go:257] certificate signing request csr-mj7kx is issued Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.201182 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.201226 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.201237 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.201260 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.201271 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:22Z","lastTransitionTime":"2026-01-27T00:07:22Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.233032 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.239306 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.239356 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 
00:07:22.239371 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.239396 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.239421 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:22Z","lastTransitionTime":"2026-01-27T00:07:22Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.255387 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.255500 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.255523 4774 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 
00:07:22.266941 4774 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.292708 4774 apiserver.go:52] "Watching apiserver" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.295789 4774 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.296016 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.296322 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.296334 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.296373 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.296428 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.296450 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.296667 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.296687 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.296779 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.296808 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.299224 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.299356 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.299962 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.300000 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.300086 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.300552 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.300729 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.300758 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.300987 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.305013 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 20:48:28.86236562 +0000 UTC Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.331274 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.343469 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.356374 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.359617 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.359674 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.359695 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.359726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.359738 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:22Z","lastTransitionTime":"2026-01-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.376576 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.389659 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.392392 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.395614 4774 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.399683 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.408347 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.414517 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.422663 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.435905 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.445358 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456343 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456399 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456424 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456453 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456479 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456502 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456524 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 
00:07:22.456689 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456731 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456758 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456802 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456830 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456877 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456902 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456923 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456946 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456984 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 27 00:07:22 
crc kubenswrapper[4774]: I0127 00:07:22.457011 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457033 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457070 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457092 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457118 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457136 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457166 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457200 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457171 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457222 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457249 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457270 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457293 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457315 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457339 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457381 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457404 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457428 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457451 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457474 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457494 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457521 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457540 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457559 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457606 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457630 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457651 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457669 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457686 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457706 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457727 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457752 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457771 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457792 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457810 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457830 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457875 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457902 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457922 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457942 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457961 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457979 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457997 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458019 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458037 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458058 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458092 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458107 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458131 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458150 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458167 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458185 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458202 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458225 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458243 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458261 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458280 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458304 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458323 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458346 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458369 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458388 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458409 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458428 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458448 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458467 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458486 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458505 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 
00:07:22.458523 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458540 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458558 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458576 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458593 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458611 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458629 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458647 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458668 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458685 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 27 
00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458705 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458724 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458742 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458761 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458780 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458814 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458834 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458870 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458896 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458926 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458948 
4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458967 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458982 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458999 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459015 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459033 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459053 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459075 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459095 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459111 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459131 4774 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459151 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459166 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459187 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459220 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459239 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459257 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459219 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459277 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459296 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459314 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459334 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459357 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459381 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459403 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459423 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459441 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459462 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459481 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459514 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459533 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459550 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459567 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459586 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459616 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459632 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459648 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459665 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459683 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459699 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459715 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459732 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459752 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459769 4774 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459788 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459806 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459825 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459842 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457493 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457689 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458495 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458817 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459849 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460270 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460430 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460459 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460501 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460552 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460612 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460688 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460827 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460763 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460912 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460870 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460939 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460935 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461151 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461125 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461226 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461256 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461343 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461351 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461438 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461492 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461509 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). 
InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461586 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461593 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461566 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461645 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461706 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461771 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461997 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462443 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462471 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462508 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462541 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462566 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462589 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462809 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462811 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462830 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462849 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462950 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462981 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.463115 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.463268 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.463273 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.463214 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.463450 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.463630 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.463830 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.464168 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.464794 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.465621 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.465949 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.466222 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.466233 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.466509 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.466702 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.467289 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:07:22.967255891 +0000 UTC m=+21.273032995 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.467532 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.468031 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.468299 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.468427 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.468679 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.468697 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.468997 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.469108 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.469121 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.469108 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.469363 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.469402 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.469652 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.469915 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.469925 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.469957 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.469972 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.470265 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.470354 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.470489 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462886 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.470700 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.470916 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.470963 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471182 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471129 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471360 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471518 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471547 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471645 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471698 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471759 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471781 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471806 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471836 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471885 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471989 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.472149 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.472604 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.472994 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.473418 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.473453 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.473481 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.473501 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.473520 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.473544 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.473368 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.473729 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.474050 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.474777 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.475253 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.475504 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.475903 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.475993 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.477063 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.477686 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.477834 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.478348 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.479121 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.479111 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.479593 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.480522 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.480756 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.484486 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.484562 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.484848 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.485128 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.485319 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.487064 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.487871 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.488127 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.488539 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.489015 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.489105 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.489318 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.489528 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.489573 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.489956 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.490269 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.490508 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.490741 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.491218 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.491516 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.492027 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494147 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494210 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494418 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494491 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494531 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494554 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494584 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494607 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494635 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494656 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494679 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494702 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494721 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494739 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494757 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494759 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494774 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494777 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494795 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494815 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494834 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494851 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494888 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494932 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494955 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494983 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495002 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494939 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495029 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495052 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495074 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495097 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495118 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495134 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495150 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495174 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495268 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495296 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502709 4774 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502743 4774 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502771 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502789 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502801 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502812 4774 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502836 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502847 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502875 4774 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502886 4774 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502901 4774 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502913 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502924 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502936 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502946 4774 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502955 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502974 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502988 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502999 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503010 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503022 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503035 4774 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503045 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503056 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503602 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503618 4774 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503714 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503735 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503969 4774 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503987 4774 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503999 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504011 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504027 4774 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504302 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504320 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504372 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504389 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504401 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504437 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504454 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504465 4774 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.506270 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.506316 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.506333 4774 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507566 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507599 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507619 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507633 4774 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507650 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507662 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507675 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507696 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507713 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507728 4774 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507739 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507754 4774 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507764 4774 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507774 4774 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507785 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495169 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.498017 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.498291 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507892 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.506237 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.498278 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.508054 4774 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.508115 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.508150 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.508163 4774 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.508186 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.508202 4774 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.498354 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.509630 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.498716 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.509672 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.509691 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.499280 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.499370 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.499390 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.499409 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.509946 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.500199 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503129 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.510050 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.508215 4774 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503443 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503804 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503830 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504893 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.505022 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.505453 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.505889 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.510415 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:23.009749024 +0000 UTC m=+21.315525908 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.510502 4774 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.510601 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.510635 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.510659 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.510664 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.510846 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.510962 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511094 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511116 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511127 4774 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511148 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511164 4774 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511179 4774 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511192 4774 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511203 4774 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511213 4774 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511224 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.506023 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.496616 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.496751 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.506066 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.506719 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507381 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.511297 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:23.011222835 +0000 UTC m=+21.316999839 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507707 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.505954 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.509411 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.509434 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.499002 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.509881 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511426 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511475 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511518 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.512118 4774 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.512165 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.512209 4774 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.512250 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.512594 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.512757 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.513664 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.513117 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.513140 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.513313 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.513498 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.513584 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.513514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.514099 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.514116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.514162 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.514822 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.512266 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.514972 4774 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.514997 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515013 4774 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515028 4774 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515043 4774 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515057 4774 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515075 4774 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515090 4774 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515102 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515117 4774 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515132 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515146 4774 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515159 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515174 4774 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515190 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515206 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515220 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515233 4774 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515251 4774 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515263 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515277 4774 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515290 4774 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515314 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515328 4774 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515341 4774 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515356 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515369 4774 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515382 4774 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515395 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515408 4774 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515420 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515476 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515491 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515504 4774 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515517 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515531 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515544 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515556 4774 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515569 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515582 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515594 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515607 4774 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515621 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515636 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515651 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.521636 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.522024 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.522186 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.522425 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.522531 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.522584 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.523011 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.523026 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.523039 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.523102 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.523118 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.523195 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.523261 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.524600 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.529751 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.514178 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:22Z","lastTransitionTime":"2026-01-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.530992 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.531019 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.531034 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.531113 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:23.031084658 +0000 UTC m=+21.336861542 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.531242 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.531266 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.531319 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.531402 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:23.031386866 +0000 UTC m=+21.337163750 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.536337 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.536627 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.539012 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.544574 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.545275 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.546336 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.546478 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.550843 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.551057 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.554193 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.563549 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.565531 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616593 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616654 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616735 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616782 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616772 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616808 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616797 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616922 4774 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616936 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616945 4774 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616956 4774 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616965 4774 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616975 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616984 4774 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616992 4774 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617004 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617014 4774 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617022 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617030 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617039 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617048 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath 
\"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617056 4774 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617064 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617072 4774 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617081 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617088 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617096 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617105 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617113 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617121 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617130 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617138 4774 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617148 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617157 4774 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" 
DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617166 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617174 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617185 4774 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617193 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617201 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617209 4774 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617222 4774 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617230 4774 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617238 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617246 4774 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617254 4774 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617263 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617271 4774 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617279 4774 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617287 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617296 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617305 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617314 4774 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617322 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617331 4774 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617341 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617350 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617359 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617380 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617389 4774 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617404 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" 
DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617415 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617424 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617433 4774 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617442 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617452 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617462 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617472 4774 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617480 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617489 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617498 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617507 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617516 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.620332 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.622295 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.637100 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.637220 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.637277 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.637319 4774 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.637212 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.637334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.637639 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:22Z","lastTransitionTime":"2026-01-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.663757 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.685842 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.709336 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.719750 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.739604 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.739642 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.739655 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.739670 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.739679 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:22Z","lastTransitionTime":"2026-01-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.841728 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.841767 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.841777 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.841812 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.841822 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:22Z","lastTransitionTime":"2026-01-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.907615 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.915594 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 00:07:22 crc kubenswrapper[4774]: W0127 00:07:22.926798 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-ffc4d2005b1fabc86bdde6c308e5a13a330c58baca0ea213e2e6919a116aa173 WatchSource:0}: Error finding container ffc4d2005b1fabc86bdde6c308e5a13a330c58baca0ea213e2e6919a116aa173: Status 404 returned error can't find the container with id ffc4d2005b1fabc86bdde6c308e5a13a330c58baca0ea213e2e6919a116aa173 Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.944566 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.944631 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.944641 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.944662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.944677 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:22Z","lastTransitionTime":"2026-01-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.020884 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.020981 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.021007 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.021043 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:07:24.021011568 +0000 UTC m=+22.326788472 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.021094 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.021169 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:24.021134761 +0000 UTC m=+22.326911645 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.021978 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.022109 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:24.022067747 +0000 UTC m=+22.327844641 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.046778 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.046818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.046827 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.046841 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.046849 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:23Z","lastTransitionTime":"2026-01-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.121834 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.121900 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.121997 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.122004 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.122013 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.122021 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.122025 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.122031 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.122067 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:24.122055521 +0000 UTC m=+22.427832405 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.122080 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:24.122074932 +0000 UTC m=+22.427851816 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.148754 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.148791 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.148800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.148812 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.148821 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:23Z","lastTransitionTime":"2026-01-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.198170 4774 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-27 00:02:22 +0000 UTC, rotation deadline is 2026-10-15 18:07:14.396689037 +0000 UTC Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.198260 4774 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6281h59m51.198432172s for next certificate rotation Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.250651 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.250720 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.250733 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.250755 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.250770 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:23Z","lastTransitionTime":"2026-01-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.306132 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 21:16:24.767943871 +0000 UTC Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.354117 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.354167 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.354180 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.354198 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.354211 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:23Z","lastTransitionTime":"2026-01-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.456487 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.456527 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.456536 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.456549 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.456568 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:23Z","lastTransitionTime":"2026-01-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.475643 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-8mtkj"] Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.476010 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8mtkj" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.477973 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.478032 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.478915 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.489988 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.502386 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.514611 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.518146 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.518198 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.518210 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"453c4954e146e01f71abf01febc41d8911b76c8cec3fd440a447e8f30a89e500"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.519722 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.520240 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.522095 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c" exitCode=255 Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.522151 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.522219 4774 scope.go:117] "RemoveContainer" containerID="5ebbc17f79533614a14a0f1e296512fcd303858d2c2f0b4b4e925c54238af8c7" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.523552 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ffc4d2005b1fabc86bdde6c308e5a13a330c58baca0ea213e2e6919a116aa173"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.525318 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb889\" (UniqueName: \"kubernetes.io/projected/052efff3-b53a-4586-8ccf-f8e9b1a47174-kube-api-access-jb889\") pod \"node-resolver-8mtkj\" (UID: \"052efff3-b53a-4586-8ccf-f8e9b1a47174\") " pod="openshift-dns/node-resolver-8mtkj" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.525400 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/052efff3-b53a-4586-8ccf-f8e9b1a47174-hosts-file\") pod \"node-resolver-8mtkj\" (UID: \"052efff3-b53a-4586-8ccf-f8e9b1a47174\") " pod="openshift-dns/node-resolver-8mtkj" 
Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.525921 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.525987 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ed0aeac8215083856262c3a8929ff514c2bb0a9d94c84b6d6fba293d94ce8c43"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.531381 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.531950 4774 scope.go:117] "RemoveContainer" containerID="4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c" Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.532148 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.537419 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c7538965369
7aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.548461 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.558038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.558069 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.558077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.558091 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.558106 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:23Z","lastTransitionTime":"2026-01-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.564481 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.576574 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.593691 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.606389 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.618712 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.626608 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/052efff3-b53a-4586-8ccf-f8e9b1a47174-hosts-file\") pod \"node-resolver-8mtkj\" (UID: \"052efff3-b53a-4586-8ccf-f8e9b1a47174\") " pod="openshift-dns/node-resolver-8mtkj" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.626668 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb889\" (UniqueName: \"kubernetes.io/projected/052efff3-b53a-4586-8ccf-f8e9b1a47174-kube-api-access-jb889\") pod \"node-resolver-8mtkj\" (UID: \"052efff3-b53a-4586-8ccf-f8e9b1a47174\") " pod="openshift-dns/node-resolver-8mtkj" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.626906 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/052efff3-b53a-4586-8ccf-f8e9b1a47174-hosts-file\") pod \"node-resolver-8mtkj\" (UID: \"052efff3-b53a-4586-8ccf-f8e9b1a47174\") " pod="openshift-dns/node-resolver-8mtkj" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.629178 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.646946 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb889\" (UniqueName: \"kubernetes.io/projected/052efff3-b53a-4586-8ccf-f8e9b1a47174-kube-api-access-jb889\") pod \"node-resolver-8mtkj\" (UID: \"052efff3-b53a-4586-8ccf-f8e9b1a47174\") " pod="openshift-dns/node-resolver-8mtkj" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.651873 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.659771 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.659966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.659989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.660008 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.660023 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:23Z","lastTransitionTime":"2026-01-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.663127 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.676059 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.688001 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.701072 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbc17f79533614a14a0f1e296512fcd303858d2c2f0b4b4e925c54238af8c7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:16Z\\\",\\\"message\\\":\\\"W0127 00:07:05.411600 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0127 00:07:05.412345 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769472425 cert, and key in /tmp/serving-cert-1556577615/serving-signer.crt, /tmp/serving-cert-1556577615/serving-signer.key\\\\nI0127 00:07:05.647477 1 observer_polling.go:159] Starting file observer\\\\nW0127 00:07:05.650164 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 00:07:05.650261 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:05.650830 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1556577615/tls.crt::/tmp/serving-cert-1556577615/tls.key\\\\\\\"\\\\nF0127 00:07:16.006924 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating 
requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 
00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.714404 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.762211 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.762249 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.762260 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.762273 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.762282 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:23Z","lastTransitionTime":"2026-01-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.788521 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8mtkj" Jan 27 00:07:23 crc kubenswrapper[4774]: W0127 00:07:23.797100 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod052efff3_b53a_4586_8ccf_f8e9b1a47174.slice/crio-832a614ba3a82e5972be975ebb0289acdca88093cb98d66a001f2f8bbd475dc3 WatchSource:0}: Error finding container 832a614ba3a82e5972be975ebb0289acdca88093cb98d66a001f2f8bbd475dc3: Status 404 returned error can't find the container with id 832a614ba3a82e5972be975ebb0289acdca88093cb98d66a001f2f8bbd475dc3 Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.860765 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-2nl9s"] Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.861047 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-57k5g"] Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.861366 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.864339 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-mtz9l"] Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.864539 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.864611 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.864633 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.864544 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l5rgv"] Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.865049 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.865743 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.865769 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.866430 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.868572 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.868596 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.868901 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.868927 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.869119 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.869283 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.869385 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.872616 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.873608 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.873650 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.873666 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.873683 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.873687 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.873702 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:23Z","lastTransitionTime":"2026-01-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.873874 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.874070 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.874230 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.875714 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.876518 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.879100 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.885526 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbc17f79533614a14a0f1e296512fcd303858d2c2f0b4b4e925c54238af8c7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:16Z\\\",\\\"message\\\":\\\"W0127 00:07:05.411600 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0127 
00:07:05.412345 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769472425 cert, and key in /tmp/serving-cert-1556577615/serving-signer.crt, /tmp/serving-cert-1556577615/serving-signer.key\\\\nI0127 00:07:05.647477 1 observer_polling.go:159] Starting file observer\\\\nW0127 00:07:05.650164 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 00:07:05.650261 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:05.650830 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1556577615/tls.crt::/tmp/serving-cert-1556577615/tls.key\\\\\\\"\\\\nF0127 00:07:16.006924 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 
00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.906326 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.921309 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.928839 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-node-log\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.928986 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd9gc\" (UniqueName: \"kubernetes.io/projected/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-kube-api-access-dd9gc\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.929097 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-var-lib-cni-bin\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.929183 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-slash\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.929264 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-script-lib\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.929343 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0abcf78e-9b05-4b89-94f3-4d3230886ce0-cni-binary-copy\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.929422 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-var-lib-cni-multus\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.929510 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c578fd22-1672-431a-9914-66f55a0260bd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.929605 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-etc-kubernetes\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.929695 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b9bv\" (UniqueName: \"kubernetes.io/projected/c578fd22-1672-431a-9914-66f55a0260bd-kube-api-access-7b9bv\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.929791 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovn-node-metrics-cert\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.929903 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l92km\" (UniqueName: \"kubernetes.io/projected/db881c9d-a960-48ae-93bf-d0ccd687e0b9-kube-api-access-l92km\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930026 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c578fd22-1672-431a-9914-66f55a0260bd-cni-binary-copy\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930116 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930199 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-os-release\") pod \"multus-mtz9l\" (UID: 
\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930277 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-run-netns\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930369 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-ovn\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930451 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-log-socket\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930530 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-config\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930612 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-env-overrides\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930690 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-hostroot\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930778 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-systemd\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930879 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmt7t\" (UniqueName: \"kubernetes.io/projected/0abcf78e-9b05-4b89-94f3-4d3230886ce0-kube-api-access-pmt7t\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930963 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-proxy-tls\") pod \"machine-config-daemon-2nl9s\" (UID: 
\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.931043 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-rootfs\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.931124 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-cnibin\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.931220 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-var-lib-openvswitch\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.931304 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-netd\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.931403 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-mcd-auth-proxy-config\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.931487 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-run-k8s-cni-cncf-io\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.931578 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-run-multus-certs\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.931676 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-system-cni-dir\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.931758 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-cnibin\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.931880 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-daemon-config\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.932008 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-os-release\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.932064 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-systemd-units\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.933971 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-cni-dir\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934072 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-kubelet\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934192 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-etc-openvswitch\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934298 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-openvswitch\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934401 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-system-cni-dir\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934484 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934619 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-ovn-kubernetes\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934671 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-bin\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934711 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-socket-dir-parent\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934739 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-var-lib-kubelet\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934764 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-conf-dir\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934792 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-netns\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.936242 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.951420 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.963485 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.975654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.975818 4774 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.975895 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.975957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.976010 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:23Z","lastTransitionTime":"2026-01-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.985788 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reaso
n\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.997832 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.010116 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.018959 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035246 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.035348 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:07:26.03532957 +0000 UTC m=+24.341106454 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035406 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-netns\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035436 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-node-log\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035454 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-var-lib-cni-bin\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035311 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035468 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-slash\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035484 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-script-lib\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035500 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd9gc\" (UniqueName: \"kubernetes.io/projected/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-kube-api-access-dd9gc\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035515 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-var-lib-cni-multus\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035545 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-var-lib-cni-bin\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035576 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-netns\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035607 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-node-log\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035614 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-slash\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035646 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c578fd22-1672-431a-9914-66f55a0260bd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035667 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035669 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-var-lib-cni-multus\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035721 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0abcf78e-9b05-4b89-94f3-4d3230886ce0-cni-binary-copy\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.035905 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.035968 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:26.035957047 +0000 UTC m=+24.341733981 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036114 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b9bv\" (UniqueName: \"kubernetes.io/projected/c578fd22-1672-431a-9914-66f55a0260bd-kube-api-access-7b9bv\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036238 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovn-node-metrics-cert\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036346 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l92km\" (UniqueName: \"kubernetes.io/projected/db881c9d-a960-48ae-93bf-d0ccd687e0b9-kube-api-access-l92km\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036426 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-etc-kubernetes\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036312 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c578fd22-1672-431a-9914-66f55a0260bd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036431 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0abcf78e-9b05-4b89-94f3-4d3230886ce0-cni-binary-copy\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036347 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-script-lib\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036455 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-etc-kubernetes\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036510 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036616 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c578fd22-1672-431a-9914-66f55a0260bd-cni-binary-copy\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036652 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-os-release\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036677 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-run-netns\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036697 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-log-socket\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036720 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-config\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036739 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-env-overrides\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036757 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-log-socket\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036759 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-hostroot\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036788 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-systemd\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036792 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-hostroot\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036804 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-ovn\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036819 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmt7t\" (UniqueName: \"kubernetes.io/projected/0abcf78e-9b05-4b89-94f3-4d3230886ce0-kube-api-access-pmt7t\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036837 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-rootfs\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036866 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-proxy-tls\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036882 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-var-lib-openvswitch\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036897 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-netd\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036914 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-cnibin\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036931 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-run-k8s-cni-cncf-io\") pod 
\"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036970 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-run-multus-certs\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036991 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037007 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-mcd-auth-proxy-config\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037022 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-system-cni-dir\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037036 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-cnibin\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037053 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-os-release\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037072 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-systemd-units\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037089 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-cni-dir\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037106 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-daemon-config\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " 
pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037147 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-kubelet\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037161 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-etc-openvswitch\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037185 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-openvswitch\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037199 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-system-cni-dir\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037215 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037246 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-bin\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037263 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-socket-dir-parent\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.037263 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037277 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-var-lib-kubelet\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037292 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-conf-dir\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.037313 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:26.037299724 +0000 UTC m=+24.343076618 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037320 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-conf-dir\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037336 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-ovn-kubernetes\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037394 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-config\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037407 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-ovn-kubernetes\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036740 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-os-release\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037443 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-run-netns\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037452 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-var-lib-openvswitch\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: 
I0127 00:07:24.037483 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-systemd\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037483 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-netd\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037510 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-cnibin\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037522 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-ovn\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037549 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-run-k8s-cni-cncf-io\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037587 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-run-multus-certs\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037633 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-etc-openvswitch\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037684 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-system-cni-dir\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037702 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-rootfs\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037727 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-cnibin\") pod 
\"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037782 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-os-release\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037800 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-mcd-auth-proxy-config\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037814 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-systemd-units\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037835 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-openvswitch\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037832 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-env-overrides\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037878 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-system-cni-dir\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037915 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-bin\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037989 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-socket-dir-parent\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.038014 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-var-lib-kubelet\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " 
pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037212 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c578fd22-1672-431a-9914-66f55a0260bd-cni-binary-copy\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.038080 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-kubelet\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.038100 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-cni-dir\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.038463 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-daemon-config\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.038597 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.038920 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.039352 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovn-node-metrics-cert\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.051253 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-proxy-tls\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.055249 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b9bv\" (UniqueName: \"kubernetes.io/projected/c578fd22-1672-431a-9914-66f55a0260bd-kube-api-access-7b9bv\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " 
pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.063297 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd9gc\" (UniqueName: \"kubernetes.io/projected/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-kube-api-access-dd9gc\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.064486 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmt7t\" (UniqueName: \"kubernetes.io/projected/0abcf78e-9b05-4b89-94f3-4d3230886ce0-kube-api-access-pmt7t\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.069476 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.078001 
4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l92km\" (UniqueName: \"kubernetes.io/projected/db881c9d-a960-48ae-93bf-d0ccd687e0b9-kube-api-access-l92km\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.078684 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.078773 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.078828 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.078908 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.078966 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:24Z","lastTransitionTime":"2026-01-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.103992 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.131545 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.138441 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.138475 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.138583 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.138594 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.138602 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.138610 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.138623 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.138626 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.138662 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:26.138650956 +0000 UTC m=+24.444427840 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.138673 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:26.138668726 +0000 UTC m=+24.444445610 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.154278 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.168490 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.181265 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.181302 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.181312 4774 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.181328 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.181337 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:24Z","lastTransitionTime":"2026-01-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.187735 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.192269 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.200011 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: W0127 00:07:24.201058 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3605b94c_c171_4ff3_a3c9_d8e6a7cf7a9a.slice/crio-c5a0e3ad5be49be95edfac85a321160c83892fc927992884046993fc624c137c WatchSource:0}: Error finding container c5a0e3ad5be49be95edfac85a321160c83892fc927992884046993fc624c137c: Status 404 returned error can't find the container with id c5a0e3ad5be49be95edfac85a321160c83892fc927992884046993fc624c137c Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.201134 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.208460 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.216917 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.221215 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: W0127 00:07:24.230742 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc578fd22_1672_431a_9914_66f55a0260bd.slice/crio-a4fb03f368f16229bd0367a444590833fecf3a54be2749f39faf5d93e1bbb4e5 WatchSource:0}: Error finding container a4fb03f368f16229bd0367a444590833fecf3a54be2749f39faf5d93e1bbb4e5: Status 404 returned error can't find the container with id a4fb03f368f16229bd0367a444590833fecf3a54be2749f39faf5d93e1bbb4e5 Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.233266 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: W0127 00:07:24.244660 4774 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb881c9d_a960_48ae_93bf_d0ccd687e0b9.slice/crio-0b00b9fd4feb292e024108c60b4d42083f61c51695221c829f872f112bca3d83 WatchSource:0}: Error finding container 0b00b9fd4feb292e024108c60b4d42083f61c51695221c829f872f112bca3d83: Status 404 returned error can't find the container with id 0b00b9fd4feb292e024108c60b4d42083f61c51695221c829f872f112bca3d83 Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.250073 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.276596 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbc17f79533614a14a0f1e296512fcd303858d2c2f0b4b4e925c54238af8c7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:16Z\\\",\\\"message\\\":\\\"W0127 00:07:05.411600 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0127 
00:07:05.412345 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769472425 cert, and key in /tmp/serving-cert-1556577615/serving-signer.crt, /tmp/serving-cert-1556577615/serving-signer.key\\\\nI0127 00:07:05.647477 1 observer_polling.go:159] Starting file observer\\\\nW0127 00:07:05.650164 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 00:07:05.650261 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:05.650830 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1556577615/tls.crt::/tmp/serving-cert-1556577615/tls.key\\\\\\\"\\\\nF0127 00:07:16.006924 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 
00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.286467 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.286509 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.286519 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.286534 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.286545 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:24Z","lastTransitionTime":"2026-01-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.293642 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.307141 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 07:22:17.506055782 +0000 UTC Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.356474 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.356821 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.356541 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.356917 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.356499 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.356971 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.365299 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.365824 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.367241 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.368060 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.369188 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.369813 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.370504 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.371500 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.372227 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.373302 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.373840 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.375396 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.376139 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.378278 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.379353 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.380548 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.381673 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.386209 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.387090 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.388372 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.389027 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.389629 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.391140 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.395111 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.396738 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.396779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.396787 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.396804 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.397030 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:24Z","lastTransitionTime":"2026-01-27T00:07:24Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.398344 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.399974 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.401698 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.402977 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.404280 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.405013 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.410175 4774 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.410404 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.413955 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.415385 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.416096 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.418166 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.419491 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 
00:07:24.421011 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.423890 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.424834 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.426114 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.427006 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.428336 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.429612 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.430279 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.431419 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.432157 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.436456 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.437010 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.437984 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.438512 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.439208 4774 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.440472 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.441074 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.499454 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.499479 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.499486 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.499499 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.499508 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:24Z","lastTransitionTime":"2026-01-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.531106 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135" exitCode=0 Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.531198 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.531269 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"0b00b9fd4feb292e024108c60b4d42083f61c51695221c829f872f112bca3d83"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.532576 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" event={"ID":"c578fd22-1672-431a-9914-66f55a0260bd","Type":"ContainerStarted","Data":"ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.532624 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" event={"ID":"c578fd22-1672-431a-9914-66f55a0260bd","Type":"ContainerStarted","Data":"a4fb03f368f16229bd0367a444590833fecf3a54be2749f39faf5d93e1bbb4e5"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.535086 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.538016 4774 scope.go:117] "RemoveContainer" containerID="4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c" Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.538159 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.539692 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mtz9l" event={"ID":"0abcf78e-9b05-4b89-94f3-4d3230886ce0","Type":"ContainerStarted","Data":"62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.539743 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mtz9l" event={"ID":"0abcf78e-9b05-4b89-94f3-4d3230886ce0","Type":"ContainerStarted","Data":"5b6b8d7c26b8d55b60d556c03d62d48dff6444cff0ca626540bec4b5124c5cc3"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.541371 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.541843 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.541939 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"c5a0e3ad5be49be95edfac85a321160c83892fc927992884046993fc624c137c"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.553109 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8mtkj" event={"ID":"052efff3-b53a-4586-8ccf-f8e9b1a47174","Type":"ContainerStarted","Data":"07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.553159 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8mtkj" event={"ID":"052efff3-b53a-4586-8ccf-f8e9b1a47174","Type":"ContainerStarted","Data":"832a614ba3a82e5972be975ebb0289acdca88093cb98d66a001f2f8bbd475dc3"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.571039 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbc17f79533614a14a0f1e296512fcd303858d2c2f0b4b4e925c54238af8c7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:16Z\\\",\\\"message\\\":\\\"W0127 00:07:05.411600 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0127 
00:07:05.412345 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769472425 cert, and key in /tmp/serving-cert-1556577615/serving-signer.crt, /tmp/serving-cert-1556577615/serving-signer.key\\\\nI0127 00:07:05.647477 1 observer_polling.go:159] Starting file observer\\\\nW0127 00:07:05.650164 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 00:07:05.650261 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:05.650830 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1556577615/tls.crt::/tmp/serving-cert-1556577615/tls.key\\\\\\\"\\\\nF0127 00:07:16.006924 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 
00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.586798 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.601372 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.601523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.601641 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.601715 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.601902 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:24Z","lastTransitionTime":"2026-01-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.607142 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/
lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.620066 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.633163 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.654105 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.669789 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.686114 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.703904 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.704635 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.704736 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.704793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.704878 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.704958 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:24Z","lastTransitionTime":"2026-01-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.719894 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: 
I0127 00:07:24.733244 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.744504 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.766131 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.783301 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.799319 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.808391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.808442 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.808455 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.808475 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.808489 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:24Z","lastTransitionTime":"2026-01-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.818011 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/
lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.829507 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.853061 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.879459 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.899256 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.911175 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.911229 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.911240 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.911258 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.911274 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:24Z","lastTransitionTime":"2026-01-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.919230 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.956303 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.985662 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\
\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.998547 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.013506 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.013553 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.013565 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.013584 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.013597 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:25Z","lastTransitionTime":"2026-01-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.020821 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.035232 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.116269 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.116311 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.116322 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.116342 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.116352 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:25Z","lastTransitionTime":"2026-01-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.219697 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.219747 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.219790 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.219808 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.219817 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:25Z","lastTransitionTime":"2026-01-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.308329 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 23:09:18.563032012 +0000 UTC Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.323717 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.323767 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.323775 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.323792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.323802 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:25Z","lastTransitionTime":"2026-01-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.426200 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.426262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.426277 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.426298 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.426311 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:25Z","lastTransitionTime":"2026-01-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.529417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.529793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.530119 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.530230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.530294 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:25Z","lastTransitionTime":"2026-01-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.557410 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.562284 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.562337 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.562354 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.562367 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.562381 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.563429 4774 generic.go:334] "Generic (PLEG): container finished" podID="c578fd22-1672-431a-9914-66f55a0260bd" containerID="ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a" exitCode=0 Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.564006 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-57k5g" event={"ID":"c578fd22-1672-431a-9914-66f55a0260bd","Type":"ContainerDied","Data":"ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.576017 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha25
6:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.594994 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.615973 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.632487 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.632940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.632974 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.632987 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.633003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.633014 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:25Z","lastTransitionTime":"2026-01-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.649145 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.666759 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.680575 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.702093 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.723216 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z 
is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.737354 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.737396 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.737406 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.737422 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.737443 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:25Z","lastTransitionTime":"2026-01-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.739153 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.755502 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.766284 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.788018 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":
\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.802588 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.814055 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.827018 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.840524 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.840559 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.840568 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.840582 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.840594 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:25Z","lastTransitionTime":"2026-01-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.841909 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.853843 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.871336 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator
@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting 
failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.883681 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.905263 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.920333 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.933682 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.944169 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.944281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.944342 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.944413 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.944481 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:25Z","lastTransitionTime":"2026-01-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.949150 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.965731 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-d
ir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":
true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.986224 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z 
is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.047159 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.047232 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.047253 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.047288 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.047311 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:26Z","lastTransitionTime":"2026-01-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.059775 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.060014 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:07:30.059981224 +0000 UTC m=+28.365758118 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.060073 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.060165 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.060314 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.060390 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:30.060373115 +0000 UTC m=+28.366149999 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.060323 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.060567 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:30.06055349 +0000 UTC m=+28.366330604 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.109120 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-g4cnl"] Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.109504 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.112551 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.112574 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.113198 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.114345 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.134031 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.149908 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.149955 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.149968 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.149993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.150012 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:26Z","lastTransitionTime":"2026-01-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.150444 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.161149 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8f7b\" (UniqueName: \"kubernetes.io/projected/2c687c86-483c-433c-8a0c-a232c5d48974-kube-api-access-v8f7b\") pod \"node-ca-g4cnl\" (UID: \"2c687c86-483c-433c-8a0c-a232c5d48974\") " pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.161236 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c687c86-483c-433c-8a0c-a232c5d48974-host\") pod \"node-ca-g4cnl\" (UID: \"2c687c86-483c-433c-8a0c-a232c5d48974\") " pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.161268 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2c687c86-483c-433c-8a0c-a232c5d48974-serviceca\") pod \"node-ca-g4cnl\" (UID: \"2c687c86-483c-433c-8a0c-a232c5d48974\") " pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.161301 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.161330 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.161475 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.161501 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.161512 4774 projected.go:194] Error preparing data for 
projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.161546 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.161577 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:30.161558672 +0000 UTC m=+28.467335556 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.161583 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.161613 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.161700 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:30.161675126 +0000 UTC m=+28.467452050 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.170562 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.195800 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete 
status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\
\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.217607 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z 
is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.231615 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.249902 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.253078 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.253185 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.253216 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.253255 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.253277 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:26Z","lastTransitionTime":"2026-01-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.262916 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8f7b\" (UniqueName: \"kubernetes.io/projected/2c687c86-483c-433c-8a0c-a232c5d48974-kube-api-access-v8f7b\") pod \"node-ca-g4cnl\" (UID: \"2c687c86-483c-433c-8a0c-a232c5d48974\") " pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.263060 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c687c86-483c-433c-8a0c-a232c5d48974-host\") pod \"node-ca-g4cnl\" (UID: \"2c687c86-483c-433c-8a0c-a232c5d48974\") " pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.263116 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2c687c86-483c-433c-8a0c-a232c5d48974-serviceca\") pod \"node-ca-g4cnl\" (UID: \"2c687c86-483c-433c-8a0c-a232c5d48974\") " pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.263248 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c687c86-483c-433c-8a0c-a232c5d48974-host\") pod \"node-ca-g4cnl\" (UID: \"2c687c86-483c-433c-8a0c-a232c5d48974\") " pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.264303 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2c687c86-483c-433c-8a0c-a232c5d48974-serviceca\") pod \"node-ca-g4cnl\" (UID: \"2c687c86-483c-433c-8a0c-a232c5d48974\") " pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.271426 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.287097 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.288821 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8f7b\" (UniqueName: \"kubernetes.io/projected/2c687c86-483c-433c-8a0c-a232c5d48974-kube-api-access-v8f7b\") pod \"node-ca-g4cnl\" (UID: \"2c687c86-483c-433c-8a0c-a232c5d48974\") " pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.301236 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2k
z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.309444 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 12:43:25.667104402 +0000 UTC Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.314888 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.346436 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.356203 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.356338 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.357084 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.357323 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.357381 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.357495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.357514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.357526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.357543 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.357554 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:26Z","lastTransitionTime":"2026-01-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.357654 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.382120 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.422667 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.432672 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: W0127 00:07:26.440367 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c687c86_483c_433c_8a0c_a232c5d48974.slice/crio-6e4deb6504b764f051c7702fe3788d5862be1fbae721213e490eb7b38e3653bf WatchSource:0}: Error finding container 6e4deb6504b764f051c7702fe3788d5862be1fbae721213e490eb7b38e3653bf: Status 404 returned error can't find the container with id 6e4deb6504b764f051c7702fe3788d5862be1fbae721213e490eb7b38e3653bf Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.460051 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.460113 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.460143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.460158 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.460167 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:26Z","lastTransitionTime":"2026-01-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.562453 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.562499 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.562511 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.562528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.562540 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:26Z","lastTransitionTime":"2026-01-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.568891 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" event={"ID":"c578fd22-1672-431a-9914-66f55a0260bd","Type":"ContainerStarted","Data":"fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.571366 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g4cnl" event={"ID":"2c687c86-483c-433c-8a0c-a232c5d48974","Type":"ContainerStarted","Data":"6e4deb6504b764f051c7702fe3788d5862be1fbae721213e490eb7b38e3653bf"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.576031 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.600374 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.649975 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z 
is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.666896 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.666959 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.666971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.666989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.667001 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:26Z","lastTransitionTime":"2026-01-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.679086 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 
00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.692669 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.704277 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.714852 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.727584 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.743743 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.768943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.768971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.768981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.769011 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.769023 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:26Z","lastTransitionTime":"2026-01-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.783460 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.824886 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.863831 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.871411 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.871440 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.871449 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.871468 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.871478 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:26Z","lastTransitionTime":"2026-01-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.906116 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.960141 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0
ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.973725 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.973760 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.973769 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.973782 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.973792 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:26Z","lastTransitionTime":"2026-01-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.984775 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.081155 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.081205 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.081221 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.081240 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.081345 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:27Z","lastTransitionTime":"2026-01-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.183969 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.184008 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.184017 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.184030 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.184039 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:27Z","lastTransitionTime":"2026-01-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.287156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.287218 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.287237 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.287266 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.287286 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:27Z","lastTransitionTime":"2026-01-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.310479 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 11:56:33.24036792 +0000 UTC Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.389664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.389698 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.389707 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.389721 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.389729 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:27Z","lastTransitionTime":"2026-01-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.493469 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.493544 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.493565 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.493594 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.493619 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:27Z","lastTransitionTime":"2026-01-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.584397 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g4cnl" event={"ID":"2c687c86-483c-433c-8a0c-a232c5d48974","Type":"ContainerStarted","Data":"8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.588802 4774 generic.go:334] "Generic (PLEG): container finished" podID="c578fd22-1672-431a-9914-66f55a0260bd" containerID="fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52" exitCode=0 Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.588891 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" event={"ID":"c578fd22-1672-431a-9914-66f55a0260bd","Type":"ContainerDied","Data":"fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.596274 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.596333 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.596352 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.596378 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.596399 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:27Z","lastTransitionTime":"2026-01-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.613515 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.640914 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.663004 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.684427 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.699816 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.699877 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.699888 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.699904 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.699918 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:27Z","lastTransitionTime":"2026-01-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.718888 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e
8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.736600 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.748594 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.760961 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.775758 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":
\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.795292 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.804478 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.804517 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.804532 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.804553 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.804569 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:27Z","lastTransitionTime":"2026-01-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.814624 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.828488 4774 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.855916 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.872564 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.893623 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and 
discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.907242 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.907312 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.907331 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 
00:07:27.907363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.907386 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:27Z","lastTransitionTime":"2026-01-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.912944 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.928936 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.953888 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.967965 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.989904 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.009519 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.015422 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.015490 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.015510 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.015554 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.015623 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:28Z","lastTransitionTime":"2026-01-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.035479 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.055392 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/r
un/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.093615 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z 
is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.112970 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.119448 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.119505 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.119517 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.119541 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.119559 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:28Z","lastTransitionTime":"2026-01-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.127494 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.142560 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.162672 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.225148 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.225602 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.225622 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.225648 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.225667 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:28Z","lastTransitionTime":"2026-01-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.312043 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 00:58:52.6490574 +0000 UTC Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.330007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.330087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.330108 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.330143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.330163 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:28Z","lastTransitionTime":"2026-01-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.356646 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.356758 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:28 crc kubenswrapper[4774]: E0127 00:07:28.356851 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.356910 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:28 crc kubenswrapper[4774]: E0127 00:07:28.357102 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:28 crc kubenswrapper[4774]: E0127 00:07:28.357270 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.433690 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.433751 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.433770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.433799 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.433820 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:28Z","lastTransitionTime":"2026-01-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.538419 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.538497 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.538522 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.538551 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.538572 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:28Z","lastTransitionTime":"2026-01-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.599576 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.604366 4774 generic.go:334] "Generic (PLEG): container finished" podID="c578fd22-1672-431a-9914-66f55a0260bd" containerID="2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74" exitCode=0 Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.604651 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" event={"ID":"c578fd22-1672-431a-9914-66f55a0260bd","Type":"ContainerDied","Data":"2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.644345 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.646900 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.647103 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.647234 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.647364 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.647515 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:28Z","lastTransitionTime":"2026-01-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.665173 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.680520 4774 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.712748 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.733268 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.751464 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.751516 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.751531 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.751556 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.751571 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:28Z","lastTransitionTime":"2026-01-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.755059 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.772408 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.793557 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.810056 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.810944 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.814954 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.822906 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.832161 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.858564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.858628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.858648 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.858676 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.858697 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:28Z","lastTransitionTime":"2026-01-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.859621 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.878499 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.895815 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.912015 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.928938 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.942630 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.959573 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.963191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.963263 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.963281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.963310 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.963360 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:28Z","lastTransitionTime":"2026-01-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.985009 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e
8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.004139 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.022044 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.039934 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.055660 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.067160 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.067229 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.067243 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.067270 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.067314 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:29Z","lastTransitionTime":"2026-01-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.074374 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.092075 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.124808 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.170107 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.170162 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.170177 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.170202 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.170217 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:29Z","lastTransitionTime":"2026-01-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.173121 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.211172 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.256674 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.273896 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.273973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.273993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.274026 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.274048 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:29Z","lastTransitionTime":"2026-01-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.288477 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.312824 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 00:49:39.694073368 +0000 UTC Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.377349 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.377403 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.377417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.377431 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.377443 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:29Z","lastTransitionTime":"2026-01-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.480731 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.480782 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.480793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.480816 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.480830 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:29Z","lastTransitionTime":"2026-01-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.585556 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.585620 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.585632 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.585663 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.585676 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:29Z","lastTransitionTime":"2026-01-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.613985 4774 generic.go:334] "Generic (PLEG): container finished" podID="c578fd22-1672-431a-9914-66f55a0260bd" containerID="06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c" exitCode=0 Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.614093 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" event={"ID":"c578fd22-1672-431a-9914-66f55a0260bd","Type":"ContainerDied","Data":"06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c"} Jan 27 00:07:29 crc kubenswrapper[4774]: E0127 00:07:29.625994 4774 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.650367 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.672284 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.688009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.688274 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.688357 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.688439 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.688512 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:29Z","lastTransitionTime":"2026-01-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.690756 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.727654 4774 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cr
i-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.748261 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.762717 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"w
aiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.784110 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z 
is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.791609 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.791658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.791669 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.791697 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.791712 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:29Z","lastTransitionTime":"2026-01-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.800379 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 
00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.816798 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.829539 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.841567 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.861262 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.878261 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.891622 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.894477 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.894527 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.894540 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.894563 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.894576 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:29Z","lastTransitionTime":"2026-01-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.907883 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.998041 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.998276 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.998294 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.998320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.998336 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:29Z","lastTransitionTime":"2026-01-27T00:07:29Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.100633 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.100668 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.100677 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.100691 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.100709 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:30Z","lastTransitionTime":"2026-01-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.111101 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.111179 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.111230 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.111330 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.111371 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:38.111358124 +0000 UTC m=+36.417135008 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.112022 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:07:38.11198607 +0000 UTC m=+36.417762954 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.112128 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.112175 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:38.112167885 +0000 UTC m=+36.417944769 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.174471 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.175959 4774 scope.go:117] "RemoveContainer" containerID="4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c" Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.176267 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.204277 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.204345 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.204363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.204391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.204409 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:30Z","lastTransitionTime":"2026-01-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.212571 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.212666 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.212891 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.212906 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.212950 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.212970 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.212923 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.213016 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.213057 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:38.213030194 +0000 UTC m=+36.518807108 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.213083 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:38.213071065 +0000 UTC m=+36.518847989 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.307495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.307569 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.307591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.307619 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.307638 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:30Z","lastTransitionTime":"2026-01-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.313084 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 03:19:17.400911378 +0000 UTC Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.356500 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.356560 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.356626 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.356705 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.356796 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.357059 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.410327 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.410375 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.410388 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.410407 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.410423 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:30Z","lastTransitionTime":"2026-01-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.514583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.514665 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.514685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.514707 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.514724 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:30Z","lastTransitionTime":"2026-01-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.619077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.619137 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.619154 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.619176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.619192 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:30Z","lastTransitionTime":"2026-01-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.625380 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.625913 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.625987 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.631116 4774 generic.go:334] "Generic (PLEG): container finished" podID="c578fd22-1672-431a-9914-66f55a0260bd" containerID="80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4" exitCode=0 Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.631736 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" event={"ID":"c578fd22-1672-431a-9914-66f55a0260bd","Type":"ContainerDied","Data":"80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.646726 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.665566 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.670203 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.682967 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.713731 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.721729 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.721782 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.721800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.721989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.722021 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:30Z","lastTransitionTime":"2026-01-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.731565 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.754420 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\
\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.785411 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360
f6f7137fe264d6e2e05e8069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.803767 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.819378 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.832141 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.832214 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.832247 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.832276 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.832296 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:30Z","lastTransitionTime":"2026-01-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.850415 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.869187 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.882151 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.894035 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.909850 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.927321 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.937142 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.937188 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.937197 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.937213 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.937227 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:30Z","lastTransitionTime":"2026-01-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.948634 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.963345 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.981248 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.997322 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.014528 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":
\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.035327 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.040986 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.041030 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.041041 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.041057 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.041072 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:31Z","lastTransitionTime":"2026-01-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.054892 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.072611 4774 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.103828 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.120254 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.143622 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.143681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.143694 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.143715 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.143728 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:31Z","lastTransitionTime":"2026-01-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.146849 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.161739 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.175392 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.191178 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mo
untPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.224729 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.246683 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.246712 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.246720 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.246735 4774 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.246746 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:31Z","lastTransitionTime":"2026-01-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.313700 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 02:28:15.383430351 +0000 UTC Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.349584 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.349640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.349665 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.349696 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.349719 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:31Z","lastTransitionTime":"2026-01-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.452606 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.452680 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.452704 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.452737 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.452763 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:31Z","lastTransitionTime":"2026-01-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.556901 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.556992 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.557013 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.557091 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.557154 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:31Z","lastTransitionTime":"2026-01-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.641195 4774 generic.go:334] "Generic (PLEG): container finished" podID="c578fd22-1672-431a-9914-66f55a0260bd" containerID="4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a" exitCode=0 Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.641254 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" event={"ID":"c578fd22-1672-431a-9914-66f55a0260bd","Type":"ContainerDied","Data":"4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a"} Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.642277 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.661260 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.661350 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.661377 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.661415 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.661441 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:31Z","lastTransitionTime":"2026-01-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.661511 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.685737 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.690346 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.712646 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.736606 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"
name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-0
1-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.757089 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name
\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.775945 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.775994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:31 crc 
kubenswrapper[4774]: I0127 00:07:31.776005 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.776023 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.776036 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:31Z","lastTransitionTime":"2026-01-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.776593 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.796303 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.813825 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.829703 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.845033 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":
\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.873468 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.884503 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.884569 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.884585 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.884608 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.884626 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:31Z","lastTransitionTime":"2026-01-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.909546 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.939617 4774 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.957369 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.980678 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.987610 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.987663 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.987678 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.987699 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.987712 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:31Z","lastTransitionTime":"2026-01-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.002653 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.014407 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.035159 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.050377 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.066902 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.084942 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.090231 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.090283 4774 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.090293 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.090313 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.090325 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.106819 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.126427 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.138731 4774 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.139308 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578b
c18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc/status\": read tcp 38.129.56.27:55304->38.129.56.27:6443: use of closed network connection" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.186196 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.194323 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.194378 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.194401 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.194423 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.194436 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.209877 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.222731 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.237153 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.269525 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.296784 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.296842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.296882 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.296903 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.296914 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.303404 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.314674 4774 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 03:56:38.686782435 +0000 UTC Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.355951 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.356000 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.355951 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:32 crc kubenswrapper[4774]: E0127 00:07:32.356121 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:32 crc kubenswrapper[4774]: E0127 00:07:32.356261 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:32 crc kubenswrapper[4774]: E0127 00:07:32.356367 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.383430 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801
e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.396703 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.399542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.399599 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.399612 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.399635 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.399652 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.424955 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.469630 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://809
13307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.502257 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.502304 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.502314 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.502332 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.502343 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.512160 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.545985 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.584111 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.604628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.604702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.604723 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.604751 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.604771 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.629164 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.631080 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.631127 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.631146 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.631167 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.631182 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.651040 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" event={"ID":"c578fd22-1672-431a-9914-66f55a0260bd","Type":"ContainerStarted","Data":"acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341"} Jan 27 00:07:32 crc kubenswrapper[4774]: E0127 00:07:32.650827 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 
2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.656376 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.656440 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.656462 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.656491 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.656511 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.667574 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e
95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: E0127 00:07:32.673656 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 
2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.682664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.682714 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.682726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.682745 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.682765 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: E0127 00:07:32.703652 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 
2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.711933 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.711989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.712007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.712030 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.712046 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.712741 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\
\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: E0127 00:07:32.725433 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 
2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.732595 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.732749 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.733002 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.733189 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.733339 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.746543 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-c
ontroller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: E0127 00:07:32.750013 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet 
has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406ee
c4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\
\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: E0127 00:07:32.750278 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.752406 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.752461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.752480 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.752504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.752519 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.791507 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.827916 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.855634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.855677 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.855688 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.855703 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.855714 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.864655 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.904391 4774 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.945580 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.958154 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.958193 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.958201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.958216 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.958227 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.985171 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.026916 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.061981 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.062029 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.062039 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.062062 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.062074 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:33Z","lastTransitionTime":"2026-01-27T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.071607 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.109153 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.150718 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.165034 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.165084 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.165100 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.165120 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.165132 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:33Z","lastTransitionTime":"2026-01-27T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.188813 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.230375 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.267430 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.267482 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.267501 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.267523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.267537 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:33Z","lastTransitionTime":"2026-01-27T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.282989 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/
lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.305401 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.314929 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 20:02:13.437670063 +0000 UTC Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.350142 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.369790 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.369833 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.369843 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.369870 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.369880 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:33Z","lastTransitionTime":"2026-01-27T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.389219 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.431545 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.473038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.473101 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:33 crc 
kubenswrapper[4774]: I0127 00:07:33.473119 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.473148 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.473171 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:33Z","lastTransitionTime":"2026-01-27T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.479230 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360
f6f7137fe264d6e2e05e8069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.505655 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.577590 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.577662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.577681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.577709 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.577729 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:33Z","lastTransitionTime":"2026-01-27T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.658677 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/0.log" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.663821 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069" exitCode=1 Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.663916 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.665005 4774 scope.go:117] "RemoveContainer" containerID="b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.681839 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.681941 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.681964 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.682066 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.682103 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:33Z","lastTransitionTime":"2026-01-27T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.703656 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.723132 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.748958 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.777160 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.786067 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.786532 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:33 crc 
kubenswrapper[4774]: I0127 00:07:33.786803 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.787045 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.787249 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:33Z","lastTransitionTime":"2026-01-27T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.806833 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360
f6f7137fe264d6e2e05e8069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"00:07:32.927299 6027 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0127 00:07:32.927264 6027 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0127 00:07:32.927354 6027 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0127 00:07:32.927375 6027 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0127 00:07:32.927391 6027 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 00:07:32.927840 6027 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0127 00:07:32.927928 6027 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0127 00:07:32.927968 6027 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0127 00:07:32.927990 6027 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0127 00:07:32.927998 6027 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 00:07:32.928023 6027 factory.go:656] Stopping watch factory\\\\nI0127 00:07:32.928047 6027 ovnkube.go:599] Stopped ovnkube\\\\nI0127 00:07:32.928053 6027 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 00:07:32.928076 6027 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0127 00:07:32.928091 6027 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0127 00:07:32.928099 6027 handler.go:208] Removed 
*v1.NetworkPolicy\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f
36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.827339 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.847879 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.873527 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.890443 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.890891 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.890940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.890957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.890976 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.890990 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:33Z","lastTransitionTime":"2026-01-27T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.903468 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.946837 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.994337 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.994389 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.994407 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.994431 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.994448 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:33Z","lastTransitionTime":"2026-01-27T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.996510 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.030598 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.075997 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.098747 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.098818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.098836 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.098897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.098955 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:34Z","lastTransitionTime":"2026-01-27T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.106306 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.202136 4774 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.202212 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.202234 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.202262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.202281 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:34Z","lastTransitionTime":"2026-01-27T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.305521 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.305596 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.305627 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.305657 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.305677 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:34Z","lastTransitionTime":"2026-01-27T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.316147 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 14:28:18.936091879 +0000 UTC Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.360483 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:34 crc kubenswrapper[4774]: E0127 00:07:34.360674 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.361331 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:34 crc kubenswrapper[4774]: E0127 00:07:34.361446 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.361532 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:34 crc kubenswrapper[4774]: E0127 00:07:34.361613 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.409262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.409303 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.409322 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.409344 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.409363 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:34Z","lastTransitionTime":"2026-01-27T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.512421 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.512475 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.512491 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.512517 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.512535 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:34Z","lastTransitionTime":"2026-01-27T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.614970 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.615011 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.615023 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.615039 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.615049 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:34Z","lastTransitionTime":"2026-01-27T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.669088 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/0.log" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.671255 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5"} Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.683256 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.698326 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c35
9beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.713389 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.716982 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.717026 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.717037 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.717053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.717063 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:34Z","lastTransitionTime":"2026-01-27T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.728489 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.743281 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.758977 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.775508 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc27
6e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.793928 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.806384 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.819926 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.819987 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.819998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.820014 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.820026 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:34Z","lastTransitionTime":"2026-01-27T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.824216 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.836301 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.848264 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.860830 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.873012 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.885359 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.901032 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"00:07:32.927299 6027 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0127 00:07:32.927264 6027 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0127 00:07:32.927354 6027 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0127 00:07:32.927375 6027 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0127 00:07:32.927391 6027 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 00:07:32.927840 6027 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0127 00:07:32.927928 6027 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0127 00:07:32.927968 6027 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0127 00:07:32.927990 6027 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0127 00:07:32.927998 6027 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 00:07:32.928023 6027 factory.go:656] Stopping watch factory\\\\nI0127 00:07:32.928047 6027 ovnkube.go:599] Stopped ovnkube\\\\nI0127 00:07:32.928053 6027 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 00:07:32.928076 6027 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0127 00:07:32.928091 6027 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0127 00:07:32.928099 6027 handler.go:208] Removed 
*v1.NetworkPolicy\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuse
s\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.922699 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.922762 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.922771 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.922785 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.922795 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:34Z","lastTransitionTime":"2026-01-27T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.025930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.025981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.025993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.026007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.026017 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:35Z","lastTransitionTime":"2026-01-27T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.128802 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.128905 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.128929 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.128960 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.128982 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:35Z","lastTransitionTime":"2026-01-27T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.233056 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.233116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.233133 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.233156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.233172 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:35Z","lastTransitionTime":"2026-01-27T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.316385 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 21:43:12.350698798 +0000 UTC Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.336235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.336286 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.336300 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.336346 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.336362 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:35Z","lastTransitionTime":"2026-01-27T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.438550 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.438621 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.438634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.438653 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.438666 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:35Z","lastTransitionTime":"2026-01-27T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.542194 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.542252 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.542269 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.542293 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.542309 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:35Z","lastTransitionTime":"2026-01-27T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.644191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.644232 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.644244 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.644257 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.644266 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:35Z","lastTransitionTime":"2026-01-27T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.747939 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.747991 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.748001 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.748018 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.748029 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:35Z","lastTransitionTime":"2026-01-27T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.851546 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.851621 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.851642 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.851671 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.851694 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:35Z","lastTransitionTime":"2026-01-27T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.954959 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.955008 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.955023 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.955044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.955058 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:35Z","lastTransitionTime":"2026-01-27T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.058474 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.058600 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.058622 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.058651 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.058672 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:36Z","lastTransitionTime":"2026-01-27T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.162654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.162730 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.162756 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.162792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.162822 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:36Z","lastTransitionTime":"2026-01-27T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.265578 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.265637 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.265652 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.265673 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.265685 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:36Z","lastTransitionTime":"2026-01-27T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.316614 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 23:28:31.593737229 +0000 UTC Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.356558 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.356712 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:36 crc kubenswrapper[4774]: E0127 00:07:36.356849 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.356717 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:36 crc kubenswrapper[4774]: E0127 00:07:36.357085 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:36 crc kubenswrapper[4774]: E0127 00:07:36.357306 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.369587 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.369664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.369676 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.369695 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.369713 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:36Z","lastTransitionTime":"2026-01-27T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.473974 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.474064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.474088 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.474123 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.474149 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:36Z","lastTransitionTime":"2026-01-27T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.577919 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.578009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.578036 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.578072 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.578101 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:36Z","lastTransitionTime":"2026-01-27T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.681696 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.681762 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.681785 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.681814 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.681839 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:36Z","lastTransitionTime":"2026-01-27T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.684521 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/1.log" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.685517 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/0.log" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.689719 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5" exitCode=1 Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.689789 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.689894 4774 scope.go:117] "RemoveContainer" containerID="b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.690995 4774 scope.go:117] "RemoveContainer" containerID="b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5" Jan 27 00:07:36 crc kubenswrapper[4774]: E0127 00:07:36.691385 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.722768 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.740917 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.762270 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.780214 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.785309 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.785366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.785383 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.785412 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.785439 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:36Z","lastTransitionTime":"2026-01-27T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.802029 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.824043 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc9019
21eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.829844 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6"] Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.830383 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.834236 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.834668 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.847453 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.865732 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.888444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.888500 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.888513 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:36 crc 
kubenswrapper[4774]: I0127 00:07:36.888537 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.888557 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:36Z","lastTransitionTime":"2026-01-27T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.899016 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be
5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.911557 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/936f5a31-13bf-456e-83d8-1aee977d2af5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.911678 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnp67\" (UniqueName: \"kubernetes.io/projected/936f5a31-13bf-456e-83d8-1aee977d2af5-kube-api-access-qnp67\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.911719 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/936f5a31-13bf-456e-83d8-1aee977d2af5-ovn-control-plane-metrics-cert\") pod 
\"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.911850 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/936f5a31-13bf-456e-83d8-1aee977d2af5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.911799 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.927135 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.944981 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.963602 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.986360 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.991175 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.991239 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:36 crc 
kubenswrapper[4774]: I0127 00:07:36.991259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.991291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.991315 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:36Z","lastTransitionTime":"2026-01-27T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.013091 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnp67\" (UniqueName: \"kubernetes.io/projected/936f5a31-13bf-456e-83d8-1aee977d2af5-kube-api-access-qnp67\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.013143 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/936f5a31-13bf-456e-83d8-1aee977d2af5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.013183 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/936f5a31-13bf-456e-83d8-1aee977d2af5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.013234 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/936f5a31-13bf-456e-83d8-1aee977d2af5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.013749 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/936f5a31-13bf-456e-83d8-1aee977d2af5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.014980 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/936f5a31-13bf-456e-83d8-1aee977d2af5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.019804 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"00:07:32.927299 6027 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0127 00:07:32.927264 6027 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0127 00:07:32.927354 6027 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0127 00:07:32.927375 6027 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0127 00:07:32.927391 6027 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 00:07:32.927840 6027 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0127 00:07:32.927928 6027 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0127 00:07:32.927968 6027 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0127 00:07:32.927990 6027 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0127 00:07:32.927998 6027 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 00:07:32.928023 6027 factory.go:656] Stopping watch factory\\\\nI0127 00:07:32.928047 6027 ovnkube.go:599] Stopped ovnkube\\\\nI0127 00:07:32.928053 6027 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 00:07:32.928076 6027 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0127 00:07:32.928091 6027 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0127 00:07:32.928099 6027 handler.go:208] Removed *v1.NetworkPolicy\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.028767 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/936f5a31-13bf-456e-83d8-1aee977d2af5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.044569 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0
ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.048972 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnp67\" (UniqueName: \"kubernetes.io/projected/936f5a31-13bf-456e-83d8-1aee977d2af5-kube-api-access-qnp67\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.061133 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.078309 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.095336 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.095403 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.095416 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.095440 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.095453 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:37Z","lastTransitionTime":"2026-01-27T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.096567 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.111070 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.150200 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.151174 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"00:07:32.927299 6027 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0127 00:07:32.927264 6027 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0127 00:07:32.927354 6027 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0127 00:07:32.927375 6027 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0127 00:07:32.927391 6027 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 00:07:32.927840 6027 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0127 00:07:32.927928 6027 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0127 00:07:32.927968 6027 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0127 00:07:32.927990 6027 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0127 00:07:32.927998 6027 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 00:07:32.928023 6027 factory.go:656] Stopping watch factory\\\\nI0127 00:07:32.928047 6027 ovnkube.go:599] Stopped ovnkube\\\\nI0127 00:07:32.928053 6027 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 00:07:32.928076 6027 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0127 00:07:32.928091 6027 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0127 00:07:32.928099 6027 handler.go:208] Removed 
*v1.NetworkPolicy\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.182532 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.198525 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.198570 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.198583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.198601 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.198613 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:37Z","lastTransitionTime":"2026-01-27T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.204621 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.220011 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.234792 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.251117 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":
\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.269953 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.283257 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.297296 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.301536 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.301583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.301596 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.301617 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.301636 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:37Z","lastTransitionTime":"2026-01-27T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.314492 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.317305 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 22:06:14.174755994 +0000 UTC Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.336435 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.405094 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.405643 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.405665 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.405684 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.405740 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:37Z","lastTransitionTime":"2026-01-27T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.508640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.508681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.508693 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.508714 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.508728 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:37Z","lastTransitionTime":"2026-01-27T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.612837 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.612916 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.612929 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.612948 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.612985 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:37Z","lastTransitionTime":"2026-01-27T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.698238 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/1.log" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.701770 4774 scope.go:117] "RemoveContainer" containerID="b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5" Jan 27 00:07:37 crc kubenswrapper[4774]: E0127 00:07:37.702081 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.704138 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" event={"ID":"936f5a31-13bf-456e-83d8-1aee977d2af5","Type":"ContainerStarted","Data":"feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.704204 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" event={"ID":"936f5a31-13bf-456e-83d8-1aee977d2af5","Type":"ContainerStarted","Data":"5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.704223 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" event={"ID":"936f5a31-13bf-456e-83d8-1aee977d2af5","Type":"ContainerStarted","Data":"986726594a406074d4dcdec60185f20bca763efacdbc4316b0a539ddc05c1c0e"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.715844 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.715916 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.715928 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.715946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.715957 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:37Z","lastTransitionTime":"2026-01-27T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.724536 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.744267 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.757512 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.773727 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.789390 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.805801 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.817978 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.818226 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.818241 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.818263 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.818279 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:37Z","lastTransitionTime":"2026-01-27T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.823535 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.849272 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.866483 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":
\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.881211 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.893294 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.905619 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.921355 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.921417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.921433 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.921456 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.921476 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:37Z","lastTransitionTime":"2026-01-27T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.922808 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.937153 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.963108 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.986409 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.010713 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.025005 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.025064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.025083 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.025109 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.025126 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:38Z","lastTransitionTime":"2026-01-27T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.044932 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.067005 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.083979 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.104083 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.122443 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc27
6e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.127281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.127329 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.127340 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.127360 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 
27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.127371 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:38Z","lastTransitionTime":"2026-01-27T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.128116 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.128318 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:07:54.128288733 +0000 UTC m=+52.434065757 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.128372 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.128489 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.128529 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.128599 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:54.128581331 +0000 UTC m=+52.434358215 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.128604 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.128648 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:54.128639282 +0000 UTC m=+52.434416166 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.141656 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.158824 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.176821 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\
\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.206166 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.224788 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.229447 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.229532 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.229721 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered 
Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.229762 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.229775 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.229811 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.229833 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.229783 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.229942 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:54.229918502 +0000 UTC m=+52.535695426 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.230194 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:54.230096367 +0000 UTC m=+52.535873271 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.230328 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.230378 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.230394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.230412 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.230424 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:38Z","lastTransitionTime":"2026-01-27T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.245217 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.262694 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.282519 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.306793 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.317481 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 12:24:22.063748476 +0000 UTC Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.333654 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.333737 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.333757 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.333787 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.333807 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:38Z","lastTransitionTime":"2026-01-27T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.336416 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b2
6e48bfb12b369fd743199ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.355955 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.355966 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.356068 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.357006 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.357322 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.357420 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.393272 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-6djzf"] Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.393935 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.394024 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.412446 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.433316 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.436892 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.436954 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.436973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.436998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.437013 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:38Z","lastTransitionTime":"2026-01-27T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.455771 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.478937 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.499744 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.515152 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.528989 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.533324 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.533452 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtfjl\" (UniqueName: \"kubernetes.io/projected/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-kube-api-access-xtfjl\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.540921 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.540966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.540981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:38 crc kubenswrapper[4774]: 
I0127 00:07:38.541012 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.541025 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:38Z","lastTransitionTime":"2026-01-27T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.549239 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.562667 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.575638 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":
\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.587764 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.603310 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.619148 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 
00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.634791 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.634850 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtfjl\" (UniqueName: \"kubernetes.io/projected/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-kube-api-access-xtfjl\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.635090 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.635248 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs podName:e639e1da-0d65-4d42-b1fc-23d5db91e9e6 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:39.135208128 +0000 UTC m=+37.440985052 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs") pod "network-metrics-daemon-6djzf" (UID: "e639e1da-0d65-4d42-b1fc-23d5db91e9e6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.637298 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.645777 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.645829 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.645848 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.645890 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.645904 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:38Z","lastTransitionTime":"2026-01-27T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.650458 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.658540 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtfjl\" (UniqueName: \"kubernetes.io/projected/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-kube-api-access-xtfjl\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.668767 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.695638 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.748392 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.748466 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.748486 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.748517 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.748538 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:38Z","lastTransitionTime":"2026-01-27T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.851852 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.851927 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.851945 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.851968 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.851985 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:38Z","lastTransitionTime":"2026-01-27T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.955592 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.955645 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.955658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.955676 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.955688 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:38Z","lastTransitionTime":"2026-01-27T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.059493 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.059567 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.059585 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.059612 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.059629 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:39Z","lastTransitionTime":"2026-01-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.140698 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:39 crc kubenswrapper[4774]: E0127 00:07:39.141099 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:39 crc kubenswrapper[4774]: E0127 00:07:39.141238 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs podName:e639e1da-0d65-4d42-b1fc-23d5db91e9e6 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:40.141199877 +0000 UTC m=+38.446976801 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs") pod "network-metrics-daemon-6djzf" (UID: "e639e1da-0d65-4d42-b1fc-23d5db91e9e6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.163351 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.163418 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.163437 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.163464 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.163486 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:39Z","lastTransitionTime":"2026-01-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.266567 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.266617 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.266637 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.266665 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.266686 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:39Z","lastTransitionTime":"2026-01-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.317940 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 13:59:45.028051748 +0000 UTC Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.369469 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.369543 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.369562 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.369589 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.369609 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:39Z","lastTransitionTime":"2026-01-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.472922 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.472995 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.473017 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.473049 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.473074 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:39Z","lastTransitionTime":"2026-01-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.576386 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.576457 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.576479 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.576508 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.576528 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:39Z","lastTransitionTime":"2026-01-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.679548 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.679618 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.679637 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.679663 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.679685 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:39Z","lastTransitionTime":"2026-01-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.783500 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.783579 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.783603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.783640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.783666 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:39Z","lastTransitionTime":"2026-01-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.888104 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.888174 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.888192 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.888227 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.888245 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:39Z","lastTransitionTime":"2026-01-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.992204 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.992280 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.992300 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.992330 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.992350 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:39Z","lastTransitionTime":"2026-01-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.096632 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.096699 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.096713 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.096738 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.096755 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:40Z","lastTransitionTime":"2026-01-27T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.155098 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:40 crc kubenswrapper[4774]: E0127 00:07:40.155341 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:40 crc kubenswrapper[4774]: E0127 00:07:40.155414 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs podName:e639e1da-0d65-4d42-b1fc-23d5db91e9e6 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:42.155396445 +0000 UTC m=+40.461173339 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs") pod "network-metrics-daemon-6djzf" (UID: "e639e1da-0d65-4d42-b1fc-23d5db91e9e6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.200020 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.200088 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.200107 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.200142 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.200164 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:40Z","lastTransitionTime":"2026-01-27T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.304312 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.304423 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.304450 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.304495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.304524 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:40Z","lastTransitionTime":"2026-01-27T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.319062 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 09:28:19.46561291 +0000 UTC Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.356513 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.356693 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.356997 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:40 crc kubenswrapper[4774]: E0127 00:07:40.356975 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.357066 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:40 crc kubenswrapper[4774]: E0127 00:07:40.357176 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:40 crc kubenswrapper[4774]: E0127 00:07:40.357465 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:40 crc kubenswrapper[4774]: E0127 00:07:40.357646 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.408260 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.408357 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.408384 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.408417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.408446 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:40Z","lastTransitionTime":"2026-01-27T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.513515 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.513605 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.513628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.513659 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.513684 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:40Z","lastTransitionTime":"2026-01-27T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.617405 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.617470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.617490 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.617514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.617532 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:40Z","lastTransitionTime":"2026-01-27T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.723647 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.723719 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.723736 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.723763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.723782 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:40Z","lastTransitionTime":"2026-01-27T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.827441 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.827495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.827506 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.827523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.827534 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:40Z","lastTransitionTime":"2026-01-27T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.931521 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.931584 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.931599 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.931622 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.931638 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:40Z","lastTransitionTime":"2026-01-27T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.035016 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.035093 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.035115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.035146 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.035166 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:41Z","lastTransitionTime":"2026-01-27T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.138648 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.138739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.138765 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.138797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.138825 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:41Z","lastTransitionTime":"2026-01-27T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.241734 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.241894 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.241916 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.241950 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.241972 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:41Z","lastTransitionTime":"2026-01-27T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.319843 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 19:15:26.44868528 +0000 UTC Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.346209 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.346258 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.346268 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.346283 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.346293 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:41Z","lastTransitionTime":"2026-01-27T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.449893 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.449976 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.449996 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.450027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.450045 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:41Z","lastTransitionTime":"2026-01-27T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.553908 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.553995 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.554024 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.554057 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.554079 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:41Z","lastTransitionTime":"2026-01-27T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.658040 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.658328 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.658359 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.658391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.658415 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:41Z","lastTransitionTime":"2026-01-27T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.762198 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.762267 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.762285 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.762311 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.762331 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:41Z","lastTransitionTime":"2026-01-27T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.865709 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.865767 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.865776 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.865795 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.865810 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:41Z","lastTransitionTime":"2026-01-27T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.969328 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.969395 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.969413 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.969444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.969461 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:41Z","lastTransitionTime":"2026-01-27T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.073079 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.073152 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.073165 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.073190 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.073210 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.176917 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.176990 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.177013 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.177064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.177221 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.179127 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.179449 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.179596 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs podName:e639e1da-0d65-4d42-b1fc-23d5db91e9e6 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:46.179554729 +0000 UTC m=+44.485331843 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs") pod "network-metrics-daemon-6djzf" (UID: "e639e1da-0d65-4d42-b1fc-23d5db91e9e6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.280986 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.281119 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.281143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.281173 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.281195 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.321108 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 13:45:53.721798109 +0000 UTC Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.356077 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.356234 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.356370 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.356437 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.356703 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.356924 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.357150 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.357554 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.378276 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.386340 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.386438 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.386467 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.386502 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.386527 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.395164 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.415383 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.446626 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.482033 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.495009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.495077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.495097 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.495128 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.495150 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.504651 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.529245 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.547263 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.565597 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.583773 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":
\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.598942 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.599027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.599087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.599119 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.599182 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.601588 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.618715 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.635227 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 
00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.657092 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.672261 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.686667 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.703067 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.703120 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.703135 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.703155 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.703169 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.720198 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.805684 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.805764 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.805785 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.805821 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.805842 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.818540 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.818675 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.818700 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.818722 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.818741 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.840725 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 
2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.846483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.846584 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.846604 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.846668 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.846692 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.873374 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 
2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.880841 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.880962 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.881017 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.881046 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.881099 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.905813 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 
2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.912268 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.912347 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.912372 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.912409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.912430 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.934087 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 
2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.941077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.941190 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.941219 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.941264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.941293 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.963940 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 
2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.964171 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.966909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.966986 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.967007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.967046 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.967073 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.100277 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.100348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.100368 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.100397 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.100420 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:43Z","lastTransitionTime":"2026-01-27T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.204623 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.204698 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.204716 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.204750 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.204772 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:43Z","lastTransitionTime":"2026-01-27T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.308741 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.308800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.308815 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.308838 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.308853 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:43Z","lastTransitionTime":"2026-01-27T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.321830 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 04:15:43.27442989 +0000 UTC Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.412406 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.412490 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.412550 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.412626 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.412647 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:43Z","lastTransitionTime":"2026-01-27T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.515358 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.515414 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.515427 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.515452 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.515466 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:43Z","lastTransitionTime":"2026-01-27T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.618329 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.618394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.618407 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.618430 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.618446 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:43Z","lastTransitionTime":"2026-01-27T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.722265 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.722331 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.722349 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.722375 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.722393 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:43Z","lastTransitionTime":"2026-01-27T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.826475 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.826527 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.826537 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.826554 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.826564 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:43Z","lastTransitionTime":"2026-01-27T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.929993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.930069 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.930089 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.930118 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.930136 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:43Z","lastTransitionTime":"2026-01-27T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.033324 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.033395 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.033415 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.033444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.033464 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:44Z","lastTransitionTime":"2026-01-27T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.137290 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.137361 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.137380 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.137409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.137429 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:44Z","lastTransitionTime":"2026-01-27T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.241003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.241070 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.241095 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.241128 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.241154 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:44Z","lastTransitionTime":"2026-01-27T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.323042 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 00:24:04.814327586 +0000 UTC Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.345032 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.345080 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.345098 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.345124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.345143 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:44Z","lastTransitionTime":"2026-01-27T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.357162 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.357307 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:44 crc kubenswrapper[4774]: E0127 00:07:44.357446 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.357476 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.357496 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:44 crc kubenswrapper[4774]: E0127 00:07:44.357693 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:44 crc kubenswrapper[4774]: E0127 00:07:44.357846 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:44 crc kubenswrapper[4774]: E0127 00:07:44.357990 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.449200 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.449288 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.449307 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.449337 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.449358 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:44Z","lastTransitionTime":"2026-01-27T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.552151 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.552216 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.552233 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.552261 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.552281 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:44Z","lastTransitionTime":"2026-01-27T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.656291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.656367 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.656385 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.656415 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.656433 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:44Z","lastTransitionTime":"2026-01-27T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.759690 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.759775 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.759794 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.759826 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.759852 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:44Z","lastTransitionTime":"2026-01-27T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.863048 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.863143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.863162 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.863192 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.863210 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:44Z","lastTransitionTime":"2026-01-27T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.966709 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.966797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.966818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.966846 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.966904 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:44Z","lastTransitionTime":"2026-01-27T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.070576 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.070658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.070678 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.070716 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.070736 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:45Z","lastTransitionTime":"2026-01-27T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.174547 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.174621 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.174640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.174673 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.174694 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:45Z","lastTransitionTime":"2026-01-27T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.277976 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.278038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.278053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.278075 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.278087 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:45Z","lastTransitionTime":"2026-01-27T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.323678 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 18:17:05.992677539 +0000 UTC Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.357057 4774 scope.go:117] "RemoveContainer" containerID="4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.380138 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.380195 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.380218 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.380242 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.380256 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:45Z","lastTransitionTime":"2026-01-27T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.483201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.483271 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.483287 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.483310 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.483325 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:45Z","lastTransitionTime":"2026-01-27T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.591481 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.592287 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.592303 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.592514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.592526 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:45Z","lastTransitionTime":"2026-01-27T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.697719 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.697779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.697800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.697834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.697890 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:45Z","lastTransitionTime":"2026-01-27T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.750982 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.753771 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.754554 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.775208 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.795580 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.801124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.801157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.801167 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.801185 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.801199 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:45Z","lastTransitionTime":"2026-01-27T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.811930 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.829207 4774 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.859915 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\
\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d780
1e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.877794 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.893753 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.904694 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.904748 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.904763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.904783 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.904797 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:45Z","lastTransitionTime":"2026-01-27T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.927900 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.951485 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.965979 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.986227 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.007611 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.007700 4774 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.007730 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.007766 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.007794 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:46Z","lastTransitionTime":"2026-01-27T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.011786 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termina
ted\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679c
cb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:46Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.030169 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"mul
tus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:46Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.050333 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:46Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.070287 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:46Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.085517 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:46Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.104111 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:46Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.110994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.111036 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.111058 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.111092 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.111117 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:46Z","lastTransitionTime":"2026-01-27T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.213935 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.213982 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.213998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.214023 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.214041 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:46Z","lastTransitionTime":"2026-01-27T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.233577 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:46 crc kubenswrapper[4774]: E0127 00:07:46.233849 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:46 crc kubenswrapper[4774]: E0127 00:07:46.233991 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs podName:e639e1da-0d65-4d42-b1fc-23d5db91e9e6 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:54.233957781 +0000 UTC m=+52.539734695 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs") pod "network-metrics-daemon-6djzf" (UID: "e639e1da-0d65-4d42-b1fc-23d5db91e9e6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.318047 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.318130 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.318151 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.318185 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.318205 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:46Z","lastTransitionTime":"2026-01-27T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.324230 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 14:37:47.338057843 +0000 UTC Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.356197 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.356216 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.356312 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:46 crc kubenswrapper[4774]: E0127 00:07:46.356329 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:46 crc kubenswrapper[4774]: E0127 00:07:46.356523 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:46 crc kubenswrapper[4774]: E0127 00:07:46.356742 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.356816 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:46 crc kubenswrapper[4774]: E0127 00:07:46.356924 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.421153 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.421230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.421253 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.421284 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.421310 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:46Z","lastTransitionTime":"2026-01-27T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.525358 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.525418 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.525436 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.525461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.525483 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:46Z","lastTransitionTime":"2026-01-27T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.628022 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.628074 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.628093 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.628122 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.628140 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:46Z","lastTransitionTime":"2026-01-27T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.731596 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.731976 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.732010 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.732044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.732067 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:46Z","lastTransitionTime":"2026-01-27T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.835169 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.835222 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.835236 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.835255 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.835267 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:46Z","lastTransitionTime":"2026-01-27T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.939360 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.939432 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.939456 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.939484 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.939502 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:46Z","lastTransitionTime":"2026-01-27T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.042839 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.042951 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.042973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.043007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.043029 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:47Z","lastTransitionTime":"2026-01-27T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.146555 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.146613 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.146626 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.146647 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.146661 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:47Z","lastTransitionTime":"2026-01-27T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.249792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.249924 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.249954 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.249988 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.250017 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:47Z","lastTransitionTime":"2026-01-27T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.324678 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 16:20:38.767977594 +0000 UTC Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.354264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.354359 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.354385 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.354422 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.354450 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:47Z","lastTransitionTime":"2026-01-27T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.458438 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.458514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.458535 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.458564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.458623 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:47Z","lastTransitionTime":"2026-01-27T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.562146 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.562225 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.562249 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.562278 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.562300 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:47Z","lastTransitionTime":"2026-01-27T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.665734 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.665829 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.665849 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.665919 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.665946 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:47Z","lastTransitionTime":"2026-01-27T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.769310 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.769384 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.769402 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.769433 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.769457 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:47Z","lastTransitionTime":"2026-01-27T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.872552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.872627 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.872643 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.872675 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.872694 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:47Z","lastTransitionTime":"2026-01-27T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.976552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.976653 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.976672 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.976703 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.976728 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:47Z","lastTransitionTime":"2026-01-27T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.080455 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.080513 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.080530 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.080557 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.080576 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:48Z","lastTransitionTime":"2026-01-27T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.183808 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.183947 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.183972 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.184009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.184037 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:48Z","lastTransitionTime":"2026-01-27T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.287204 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.287276 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.287297 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.287325 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.287349 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:48Z","lastTransitionTime":"2026-01-27T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.325627 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 22:01:25.560990453 +0000 UTC Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.356462 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.356611 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:48 crc kubenswrapper[4774]: E0127 00:07:48.356664 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.356758 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:48 crc kubenswrapper[4774]: E0127 00:07:48.356849 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.356971 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:48 crc kubenswrapper[4774]: E0127 00:07:48.357095 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:48 crc kubenswrapper[4774]: E0127 00:07:48.357238 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.390946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.391022 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.391049 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.391087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.391114 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:48Z","lastTransitionTime":"2026-01-27T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.494116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.494173 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.494184 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.494205 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.494217 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:48Z","lastTransitionTime":"2026-01-27T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.597469 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.597523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.597533 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.597555 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.597565 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:48Z","lastTransitionTime":"2026-01-27T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.702246 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.702309 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.702328 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.702355 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.702376 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:48Z","lastTransitionTime":"2026-01-27T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.805591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.805651 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.805672 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.805698 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.805718 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:48Z","lastTransitionTime":"2026-01-27T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.908989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.909065 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.909090 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.909124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.909153 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:48Z","lastTransitionTime":"2026-01-27T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.012536 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.012603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.012620 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.012640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.012654 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:49Z","lastTransitionTime":"2026-01-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.115376 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.115448 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.115466 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.115495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.115515 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:49Z","lastTransitionTime":"2026-01-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.218320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.218386 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.218407 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.218433 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.218452 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:49Z","lastTransitionTime":"2026-01-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.321569 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.321632 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.321651 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.321677 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.321727 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:49Z","lastTransitionTime":"2026-01-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.326351 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 18:20:45.149978856 +0000 UTC Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.425374 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.425442 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.425461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.425494 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.425512 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:49Z","lastTransitionTime":"2026-01-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.532514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.532703 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.532717 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.532737 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.532750 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:49Z","lastTransitionTime":"2026-01-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.636763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.636908 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.636936 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.636973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.636997 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:49Z","lastTransitionTime":"2026-01-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.740721 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.740848 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.740904 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.740944 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.740969 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:49Z","lastTransitionTime":"2026-01-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.844479 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.844554 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.844578 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.844606 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.844628 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:49Z","lastTransitionTime":"2026-01-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.947388 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.947441 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.947458 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.947486 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.947505 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:49Z","lastTransitionTime":"2026-01-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.051315 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.051378 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.051400 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.051428 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.051444 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:50Z","lastTransitionTime":"2026-01-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.155620 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.155718 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.155739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.155769 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.155791 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:50Z","lastTransitionTime":"2026-01-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.259512 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.259590 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.259613 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.259646 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.259670 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:50Z","lastTransitionTime":"2026-01-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.326942 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 07:10:15.389749797 +0000 UTC Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.356543 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.356630 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.356666 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.356564 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:50 crc kubenswrapper[4774]: E0127 00:07:50.356800 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:50 crc kubenswrapper[4774]: E0127 00:07:50.357029 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:50 crc kubenswrapper[4774]: E0127 00:07:50.357189 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:50 crc kubenswrapper[4774]: E0127 00:07:50.357373 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.362449 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.362572 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.362592 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.362620 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.362642 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:50Z","lastTransitionTime":"2026-01-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.466500 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.466579 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.466603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.466635 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.466657 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:50Z","lastTransitionTime":"2026-01-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.570178 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.570240 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.570259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.570282 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.570301 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:50Z","lastTransitionTime":"2026-01-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.674117 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.674156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.674169 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.674184 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.674196 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:50Z","lastTransitionTime":"2026-01-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.778040 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.778125 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.778145 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.778176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.778196 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:50Z","lastTransitionTime":"2026-01-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.881749 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.881810 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.881828 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.881855 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.881911 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:50Z","lastTransitionTime":"2026-01-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.985718 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.985797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.985822 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.985887 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.985912 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:50Z","lastTransitionTime":"2026-01-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.089434 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.089492 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.089504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.089531 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.089546 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:51Z","lastTransitionTime":"2026-01-27T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.193423 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.193483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.193502 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.193527 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.193544 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:51Z","lastTransitionTime":"2026-01-27T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.297406 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.297475 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.297498 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.297528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.297548 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:51Z","lastTransitionTime":"2026-01-27T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.327387 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 19:52:23.569776092 +0000 UTC Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.401465 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.401557 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.401584 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.401624 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.401653 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:51Z","lastTransitionTime":"2026-01-27T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.504502 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.504558 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.504577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.504601 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.504620 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:51Z","lastTransitionTime":"2026-01-27T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.608293 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.608365 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.608399 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.608432 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.608459 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:51Z","lastTransitionTime":"2026-01-27T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.711129 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.711466 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.711594 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.711821 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.712047 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:51Z","lastTransitionTime":"2026-01-27T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.814738 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.814915 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.814938 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.814968 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.814988 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:51Z","lastTransitionTime":"2026-01-27T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.918421 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.918503 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.918523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.918553 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.918571 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:51Z","lastTransitionTime":"2026-01-27T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.021738 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.021824 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.021903 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.021948 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.021975 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:52Z","lastTransitionTime":"2026-01-27T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.125959 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.126061 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.126081 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.126110 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.126131 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:52Z","lastTransitionTime":"2026-01-27T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.230524 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.230598 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.230620 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.230648 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.230668 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:52Z","lastTransitionTime":"2026-01-27T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.328148 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 01:12:08.43469906 +0000 UTC Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.334334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.334408 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.334429 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.334458 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.334481 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:52Z","lastTransitionTime":"2026-01-27T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.356116 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.356169 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.356223 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.356372 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:52 crc kubenswrapper[4774]: E0127 00:07:52.357802 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:52 crc kubenswrapper[4774]: E0127 00:07:52.358178 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:52 crc kubenswrapper[4774]: E0127 00:07:52.358445 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:52 crc kubenswrapper[4774]: E0127 00:07:52.358604 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.363069 4774 scope.go:117] "RemoveContainer" containerID="b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.396025 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2
f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.414295 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.445143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.445202 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.445221 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.445250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.445269 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:52Z","lastTransitionTime":"2026-01-27T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.450806 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.504954 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.518847 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.534117 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.547347 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.548134 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.548212 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.548230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.548250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.548264 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:52Z","lastTransitionTime":"2026-01-27T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.564703 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.582481 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.595596 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.609718 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.623610 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.637586 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.651534 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.651574 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.651610 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.651630 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.651642 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:52Z","lastTransitionTime":"2026-01-27T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.653128 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.667968 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.681767 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.697024 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 
00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.756936 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.756995 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.757010 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.757041 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.757055 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:52Z","lastTransitionTime":"2026-01-27T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.784392 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/1.log" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.786113 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.787361 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.802609 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.823106 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.839375 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.855268 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.860471 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.860518 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.860527 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.860541 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.860559 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:52Z","lastTransitionTime":"2026-01-27T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.877717 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.896315 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.913703 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.928286 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.942922 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 
00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.961207 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.962938 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.962972 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.962981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.962997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.963010 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:52Z","lastTransitionTime":"2026-01-27T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.972269 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.985540 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.003078 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.013397 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.013446 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc 
kubenswrapper[4774]: I0127 00:07:53.013456 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.013474 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.013484 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.026428 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc22
6684101fa8d090dec9d8a1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: E0127 00:07:53.029732 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.033264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.033304 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.033318 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.033335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.033348 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.044555 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: E0127 00:07:53.051017 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.054444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.054482 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.054495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.054513 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.054526 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.060317 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: E0127 00:07:53.069734 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e
18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.073915 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.073971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.073987 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.074011 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.074033 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.075328 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: E0127 00:07:53.088108 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.092186 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.092250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.092261 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.092282 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.092297 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: E0127 00:07:53.110027 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: E0127 00:07:53.110206 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.112621 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.112676 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.112697 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.112726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.112744 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.216402 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.216459 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.216472 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.216493 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.216508 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.320519 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.320577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.320595 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.320616 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.320630 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.328565 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 23:42:23.361095693 +0000 UTC Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.423612 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.423654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.423668 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.423685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.423697 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.526335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.526383 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.526393 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.526409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.526420 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.629675 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.629746 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.629770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.629801 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.629823 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.732364 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.732429 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.732445 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.732467 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.732485 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.792653 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/2.log" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.793720 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/1.log" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.798247 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba" exitCode=1 Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.798335 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.798445 4774 scope.go:117] "RemoveContainer" containerID="b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.799807 4774 scope.go:117] "RemoveContainer" containerID="5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba" Jan 27 00:07:53 crc kubenswrapper[4774]: E0127 00:07:53.800198 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.828709 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc22
6684101fa8d090dec9d8a1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"ervices_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293219 6404 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293221 6404 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0127 00:07:53.292667 6404 services_controller.go:473] Services do not match for network=default, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\
"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.834635 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.834705 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.834731 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.834767 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.834787 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.848617 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.866727 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.880142 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.900051 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 
2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.916392 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"pod
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.933740 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.938666 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.938751 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.938779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.938816 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.938848 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.953127 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.971045 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.985173 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.007541 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.027499 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.043306 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.043362 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.043384 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.043418 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.043439 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:54Z","lastTransitionTime":"2026-01-27T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.044081 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.063237 4774 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.087438 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\
\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d780
1e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.100335 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.114540 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.135107 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.135279 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.135406 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:54 crc 
kubenswrapper[4774]: E0127 00:07:54.135510 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:08:26.135408613 +0000 UTC m=+84.441185507 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.135582 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.135612 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.135718 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:08:26.13569313 +0000 UTC m=+84.441470024 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.135771 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:08:26.135730941 +0000 UTC m=+84.441508005 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.146068 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.146142 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.146168 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.146203 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.146227 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:54Z","lastTransitionTime":"2026-01-27T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.237205 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.237347 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.237403 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.237684 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.237811 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.237850 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.237913 4774 
projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.237953 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs podName:e639e1da-0d65-4d42-b1fc-23d5db91e9e6 nodeName:}" failed. No retries permitted until 2026-01-27 00:08:10.237903956 +0000 UTC m=+68.543680870 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs") pod "network-metrics-daemon-6djzf" (UID: "e639e1da-0d65-4d42-b1fc-23d5db91e9e6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.237702 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.238008 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 00:08:26.237975328 +0000 UTC m=+84.543752402 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.238102 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.238151 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.238307 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 00:08:26.238257955 +0000 UTC m=+84.544035029 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.249191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.249235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.249246 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.249267 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.249280 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:54Z","lastTransitionTime":"2026-01-27T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.329356 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 20:54:44.987034761 +0000 UTC Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.352737 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.352838 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.352881 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.352913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.352938 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:54Z","lastTransitionTime":"2026-01-27T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.356220 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.356268 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.356287 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.356443 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.356550 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.356756 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.357395 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.357584 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.456917 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.456997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.457063 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.457705 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.457769 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:54Z","lastTransitionTime":"2026-01-27T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.561425 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.561490 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.561503 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.561523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.561537 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:54Z","lastTransitionTime":"2026-01-27T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.664484 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.664555 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.664573 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.664597 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.664614 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:54Z","lastTransitionTime":"2026-01-27T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.767788 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.767887 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.767898 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.767913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.767922 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:54Z","lastTransitionTime":"2026-01-27T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.805327 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/2.log" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.810526 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.812505 4774 scope.go:117] "RemoveContainer" containerID="5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba" Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.812787 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.825131 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.833509 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.857580 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.871101 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.871184 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.871203 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.871232 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.871252 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:54Z","lastTransitionTime":"2026-01-27T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.876487 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.901625 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.927447 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"ervices_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293219 6404 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293221 6404 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0127 00:07:53.292667 6404 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, 
Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.948581 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4
ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.968242 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.974793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.974849 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.974876 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.974894 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.974907 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:54Z","lastTransitionTime":"2026-01-27T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.987992 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.002098 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.018316 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.040473 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc27
6e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.056896 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.074202 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.077094 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.078470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.078515 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.078542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.078561 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:55Z","lastTransitionTime":"2026-01-27T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.091957 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.115319 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b
7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.127934 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.139560 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.154030 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.166985 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.181164 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.181220 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.181239 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.181276 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.181296 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:55Z","lastTransitionTime":"2026-01-27T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.186541 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c
2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\"
,\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\
\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.220659 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc22
6684101fa8d090dec9d8a1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"ervices_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293219 6404 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293221 6404 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0127 00:07:53.292667 6404 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.238094 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.260403 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.276988 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.285930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.286011 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.286027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.286077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.286095 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:55Z","lastTransitionTime":"2026-01-27T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.291521 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.309306 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.324243 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a938
0066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.329813 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 22:46:03.845419769 +0000 UTC Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.339469 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.355063 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.371319 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.388152 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.389804 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.389919 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.389942 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.389973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.389996 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:55Z","lastTransitionTime":"2026-01-27T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.406179 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.420849 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.444788 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd00
6d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a
11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.459206 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.493603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.493640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.493652 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.493675 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.493689 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:55Z","lastTransitionTime":"2026-01-27T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.597523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.597587 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.597603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.597629 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.597651 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:55Z","lastTransitionTime":"2026-01-27T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.701209 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.701263 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.701273 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.701296 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.701306 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:55Z","lastTransitionTime":"2026-01-27T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.804321 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.804429 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.804455 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.804490 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.804516 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:55Z","lastTransitionTime":"2026-01-27T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.909439 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.909543 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.909563 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.909597 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.909627 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:55Z","lastTransitionTime":"2026-01-27T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.015010 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.015087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.015106 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.015135 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.015155 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:56Z","lastTransitionTime":"2026-01-27T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.118171 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.118241 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.118261 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.118291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.118312 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:56Z","lastTransitionTime":"2026-01-27T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.221576 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.221649 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.221662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.221682 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.221696 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:56Z","lastTransitionTime":"2026-01-27T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.324797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.324836 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.324846 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.324877 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.324887 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:56Z","lastTransitionTime":"2026-01-27T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.330483 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 16:45:31.613671557 +0000 UTC Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.356163 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.356272 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.356282 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.356289 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:56 crc kubenswrapper[4774]: E0127 00:07:56.356368 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:56 crc kubenswrapper[4774]: E0127 00:07:56.356531 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:56 crc kubenswrapper[4774]: E0127 00:07:56.356690 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:56 crc kubenswrapper[4774]: E0127 00:07:56.356800 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.427359 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.427399 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.427410 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.427425 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.427434 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:56Z","lastTransitionTime":"2026-01-27T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.530666 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.530710 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.530719 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.530733 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.530742 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:56Z","lastTransitionTime":"2026-01-27T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.633572 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.633667 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.633687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.633718 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.633742 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:56Z","lastTransitionTime":"2026-01-27T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.708181 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.726124 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.735838 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.735914 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.735927 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.735946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.735958 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:56Z","lastTransitionTime":"2026-01-27T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.744528 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.760336 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.773971 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.787605 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.800716 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.810185 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.823351 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.838726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.838772 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.838785 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.838805 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.838817 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:56Z","lastTransitionTime":"2026-01-27T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.840685 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.855896 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.869348 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.894052 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.905099 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.919753 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.934524 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.941805 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.941887 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.941899 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.941918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.941931 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:56Z","lastTransitionTime":"2026-01-27T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.954644 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f
3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.978681 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc22
6684101fa8d090dec9d8a1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"ervices_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293219 6404 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293221 6404 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0127 00:07:53.292667 6404 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.996062 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.044928 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.045007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.045029 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.045053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.045071 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:57Z","lastTransitionTime":"2026-01-27T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.147762 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.147795 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.147804 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.147820 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.147830 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:57Z","lastTransitionTime":"2026-01-27T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.251373 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.251476 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.251498 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.251529 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.251550 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:57Z","lastTransitionTime":"2026-01-27T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.331597 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 08:32:08.645048274 +0000 UTC Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.354795 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.354948 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.354974 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.355012 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.355038 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:57Z","lastTransitionTime":"2026-01-27T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.458572 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.458650 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.458667 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.458693 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.458713 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:57Z","lastTransitionTime":"2026-01-27T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.562519 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.562569 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.562583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.562602 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.562614 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:57Z","lastTransitionTime":"2026-01-27T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.669446 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.669516 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.669533 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.669557 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.669574 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:57Z","lastTransitionTime":"2026-01-27T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.771756 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.771808 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.771820 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.771837 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.771850 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:57Z","lastTransitionTime":"2026-01-27T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.874269 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.874339 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.874359 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.874388 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.874411 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:57Z","lastTransitionTime":"2026-01-27T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.977909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.977984 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.978004 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.978031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.978050 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:57Z","lastTransitionTime":"2026-01-27T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.081379 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.081424 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.081433 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.081452 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.081464 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:58Z","lastTransitionTime":"2026-01-27T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.184685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.184747 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.184756 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.184774 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.184785 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:58Z","lastTransitionTime":"2026-01-27T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.286966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.287014 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.287027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.287047 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.287059 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:58Z","lastTransitionTime":"2026-01-27T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.332647 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 14:43:06.411124686 +0000 UTC Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.356113 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.356157 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.356220 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.356122 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:58 crc kubenswrapper[4774]: E0127 00:07:58.356287 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:58 crc kubenswrapper[4774]: E0127 00:07:58.356434 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:58 crc kubenswrapper[4774]: E0127 00:07:58.356522 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:58 crc kubenswrapper[4774]: E0127 00:07:58.356513 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.389290 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.389364 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.389384 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.389450 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.389476 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:58Z","lastTransitionTime":"2026-01-27T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.491483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.491546 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.491564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.491582 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.491593 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:58Z","lastTransitionTime":"2026-01-27T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.594014 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.594082 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.594102 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.594129 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.594147 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:58Z","lastTransitionTime":"2026-01-27T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.696527 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.696562 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.696570 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.696583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.696593 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:58Z","lastTransitionTime":"2026-01-27T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.798631 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.798688 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.798700 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.798717 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.798730 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:58Z","lastTransitionTime":"2026-01-27T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.901623 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.901700 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.901722 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.901752 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.901775 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:58Z","lastTransitionTime":"2026-01-27T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.004500 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.004572 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.004588 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.004612 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.004630 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:59Z","lastTransitionTime":"2026-01-27T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.106895 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.106950 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.106963 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.106983 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.106998 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:59Z","lastTransitionTime":"2026-01-27T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.209538 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.209600 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.209609 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.209624 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.209634 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:59Z","lastTransitionTime":"2026-01-27T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.312552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.312593 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.312603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.312621 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.312633 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:59Z","lastTransitionTime":"2026-01-27T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.333595 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 21:05:13.868922466 +0000 UTC Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.414919 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.415161 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.415220 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.415282 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.415340 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:59Z","lastTransitionTime":"2026-01-27T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.518116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.518386 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.518453 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.518514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.518566 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:59Z","lastTransitionTime":"2026-01-27T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.621182 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.621235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.621248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.621269 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.621283 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:59Z","lastTransitionTime":"2026-01-27T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.723780 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.723812 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.723820 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.723835 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.723843 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:59Z","lastTransitionTime":"2026-01-27T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.826334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.826383 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.826396 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.826412 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.826426 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:59Z","lastTransitionTime":"2026-01-27T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.928987 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.929376 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.929531 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.929678 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.929832 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:59Z","lastTransitionTime":"2026-01-27T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.033359 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.033407 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.033417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.033434 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.033446 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:00Z","lastTransitionTime":"2026-01-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.136209 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.136257 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.136271 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.136287 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.136299 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:00Z","lastTransitionTime":"2026-01-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.238483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.238513 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.238521 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.238534 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.238543 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:00Z","lastTransitionTime":"2026-01-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.334374 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 11:23:33.352653196 +0000 UTC Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.342453 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.342506 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.342523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.342548 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.342567 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:00Z","lastTransitionTime":"2026-01-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.355897 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.356012 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:00 crc kubenswrapper[4774]: E0127 00:08:00.356047 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:00 crc kubenswrapper[4774]: E0127 00:08:00.356204 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.356298 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:00 crc kubenswrapper[4774]: E0127 00:08:00.356431 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.356550 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:00 crc kubenswrapper[4774]: E0127 00:08:00.356637 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.445661 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.446037 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.446127 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.446219 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.446310 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:00Z","lastTransitionTime":"2026-01-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.549598 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.549681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.549706 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.549732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.549750 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:00Z","lastTransitionTime":"2026-01-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.653089 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.653166 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.653192 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.653228 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.653253 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:00Z","lastTransitionTime":"2026-01-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.756762 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.756830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.756853 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.756935 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.756959 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:00Z","lastTransitionTime":"2026-01-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.860309 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.860378 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.860441 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.860514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.860540 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:00Z","lastTransitionTime":"2026-01-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.963588 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.963664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.963688 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.963718 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.963742 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:00Z","lastTransitionTime":"2026-01-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.066628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.066696 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.066714 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.066739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.066756 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:01Z","lastTransitionTime":"2026-01-27T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.169332 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.169391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.169404 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.169426 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.169441 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:01Z","lastTransitionTime":"2026-01-27T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.272955 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.273307 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.273439 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.273558 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.273640 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:01Z","lastTransitionTime":"2026-01-27T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.335179 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 03:05:08.33281967 +0000 UTC Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.377180 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.377252 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.377269 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.377291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.377306 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:01Z","lastTransitionTime":"2026-01-27T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.479964 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.480042 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.480067 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.480097 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.480120 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:01Z","lastTransitionTime":"2026-01-27T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.583843 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.583950 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.583971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.583999 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.584017 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:01Z","lastTransitionTime":"2026-01-27T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.686900 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.687320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.687470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.687631 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.687842 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:01Z","lastTransitionTime":"2026-01-27T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.791170 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.791221 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.791239 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.791264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.791281 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:01Z","lastTransitionTime":"2026-01-27T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.893926 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.893980 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.893989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.894010 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.894021 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:01Z","lastTransitionTime":"2026-01-27T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.996797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.996831 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.996842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.996858 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.996866 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:01Z","lastTransitionTime":"2026-01-27T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.099063 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.099108 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.099116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.099131 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.099142 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:02Z","lastTransitionTime":"2026-01-27T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.202089 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.202155 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.202172 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.202198 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.202215 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:02Z","lastTransitionTime":"2026-01-27T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.305121 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.305193 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.305210 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.305235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.305257 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:02Z","lastTransitionTime":"2026-01-27T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.336510 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 15:47:16.857519182 +0000 UTC Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.355828 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.355905 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.355961 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:02 crc kubenswrapper[4774]: E0127 00:08:02.356080 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.356129 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:02 crc kubenswrapper[4774]: E0127 00:08:02.356286 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:02 crc kubenswrapper[4774]: E0127 00:08:02.356432 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:02 crc kubenswrapper[4774]: E0127 00:08:02.356613 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.380161 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b9497751
4306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.393191 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.406362 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.407683 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.407739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.407752 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.407770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.407779 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:02Z","lastTransitionTime":"2026-01-27T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.417532 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.433756 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.458004 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"ervices_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293219 6404 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293221 6404 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0127 00:07:53.292667 6404 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.472078 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.484206 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.495741 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.505217 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.510286 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.510314 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.510324 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.510341 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.510352 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:02Z","lastTransitionTime":"2026-01-27T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.517551 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.536221 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.550292 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.562911 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.573312 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.585093 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.597320 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.605946 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.613060 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.613122 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.613136 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.613152 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.613162 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:02Z","lastTransitionTime":"2026-01-27T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.716003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.716052 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.716063 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.716105 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.716117 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:02Z","lastTransitionTime":"2026-01-27T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.818666 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.818738 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.818762 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.818793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.818814 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:02Z","lastTransitionTime":"2026-01-27T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.922117 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.922168 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.922179 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.922194 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.922205 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:02Z","lastTransitionTime":"2026-01-27T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.024962 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.025000 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.025008 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.025025 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.025036 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.127880 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.127958 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.127977 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.127992 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.128023 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.231545 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.231591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.231603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.231620 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.231633 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.334744 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.334838 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.334906 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.334940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.334966 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.337025 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 17:48:54.960392019 +0000 UTC Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.444048 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.444144 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.444161 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.444208 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.444224 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.446199 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.446289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.446315 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.446346 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.446367 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: E0127 00:08:03.470456 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.476429 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.476656 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.476797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.477033 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.477186 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: E0127 00:08:03.499150 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.506077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.506160 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.506187 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.506221 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.506245 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: E0127 00:08:03.528792 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.535115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.535164 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.535181 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.535206 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.535227 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: E0127 00:08:03.558402 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.563805 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.563905 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.563930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.563958 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.563980 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: E0127 00:08:03.586481 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:03 crc kubenswrapper[4774]: E0127 00:08:03.586617 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.589574 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.589617 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.589631 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.589651 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.589667 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.692946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.692996 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.693007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.693024 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.693035 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.795390 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.795457 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.795475 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.795497 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.795514 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.898839 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.898893 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.898904 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.898919 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.898928 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.001295 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.001352 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.001369 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.001394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.001412 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:04Z","lastTransitionTime":"2026-01-27T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.104268 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.104353 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.104374 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.104404 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.104424 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:04Z","lastTransitionTime":"2026-01-27T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.207449 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.207515 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.207532 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.207552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.207565 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:04Z","lastTransitionTime":"2026-01-27T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.311212 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.311298 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.311333 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.311363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.311388 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:04Z","lastTransitionTime":"2026-01-27T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.337459 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 02:16:16.964324352 +0000 UTC Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.355987 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.356106 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.355995 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.355995 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:04 crc kubenswrapper[4774]: E0127 00:08:04.356236 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:04 crc kubenswrapper[4774]: E0127 00:08:04.356386 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:04 crc kubenswrapper[4774]: E0127 00:08:04.356566 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:04 crc kubenswrapper[4774]: E0127 00:08:04.356737 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.413813 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.413910 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.413934 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.413966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.413987 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:04Z","lastTransitionTime":"2026-01-27T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.516609 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.516660 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.516676 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.516696 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.516710 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:04Z","lastTransitionTime":"2026-01-27T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.620823 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.620942 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.620962 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.621000 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.621021 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:04Z","lastTransitionTime":"2026-01-27T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.725763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.725830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.725852 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.725913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.725932 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:04Z","lastTransitionTime":"2026-01-27T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.829469 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.829544 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.829569 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.829600 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.829620 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:04Z","lastTransitionTime":"2026-01-27T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.934255 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.934353 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.934374 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.934409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.934433 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:04Z","lastTransitionTime":"2026-01-27T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.037714 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.037778 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.037797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.037822 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.037840 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:05Z","lastTransitionTime":"2026-01-27T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.141255 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.141321 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.141343 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.141373 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.141391 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:05Z","lastTransitionTime":"2026-01-27T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.246755 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.246818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.246830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.246853 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.246888 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:05Z","lastTransitionTime":"2026-01-27T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.338545 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 21:48:53.290432675 +0000 UTC Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.350035 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.350092 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.350109 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.350133 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.350150 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:05Z","lastTransitionTime":"2026-01-27T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.357426 4774 scope.go:117] "RemoveContainer" containerID="5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba" Jan 27 00:08:05 crc kubenswrapper[4774]: E0127 00:08:05.357647 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.452914 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.452961 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.452974 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.452990 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.453000 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:05Z","lastTransitionTime":"2026-01-27T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.555433 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.555501 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.555517 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.555542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.555557 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:05Z","lastTransitionTime":"2026-01-27T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.658561 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.658605 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.658617 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.658634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.658644 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:05Z","lastTransitionTime":"2026-01-27T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.761957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.762055 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.762082 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.762108 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.762129 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:05Z","lastTransitionTime":"2026-01-27T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.865138 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.865211 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.865228 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.865255 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.865289 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:05Z","lastTransitionTime":"2026-01-27T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.968983 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.969053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.969072 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.969100 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.969122 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:05Z","lastTransitionTime":"2026-01-27T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.073522 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.073603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.073627 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.073662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.073683 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:06Z","lastTransitionTime":"2026-01-27T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.177176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.177222 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.177240 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.177262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.177280 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:06Z","lastTransitionTime":"2026-01-27T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.280794 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.280907 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.280931 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.280961 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.280980 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:06Z","lastTransitionTime":"2026-01-27T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.339189 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 13:56:19.857443462 +0000 UTC Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.356700 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.356713 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.356713 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.356891 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:06 crc kubenswrapper[4774]: E0127 00:08:06.357134 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:06 crc kubenswrapper[4774]: E0127 00:08:06.357273 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:06 crc kubenswrapper[4774]: E0127 00:08:06.357479 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:06 crc kubenswrapper[4774]: E0127 00:08:06.357669 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.383791 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.383838 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.383853 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.383889 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.383903 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:06Z","lastTransitionTime":"2026-01-27T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.487232 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.487305 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.487327 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.487355 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.487374 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:06Z","lastTransitionTime":"2026-01-27T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.590760 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.590823 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.590845 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.590913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.590936 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:06Z","lastTransitionTime":"2026-01-27T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.693118 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.693181 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.693199 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.693224 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.693243 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:06Z","lastTransitionTime":"2026-01-27T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.796112 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.796157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.796168 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.796186 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.796200 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:06Z","lastTransitionTime":"2026-01-27T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.898985 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.899057 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.899075 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.899103 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.899122 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:06Z","lastTransitionTime":"2026-01-27T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.001505 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.001604 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.001629 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.001662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.001680 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:07Z","lastTransitionTime":"2026-01-27T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.104264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.104349 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.104371 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.104403 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.104427 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:07Z","lastTransitionTime":"2026-01-27T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.207504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.207579 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.207595 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.207620 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.207638 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:07Z","lastTransitionTime":"2026-01-27T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.310357 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.310422 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.310435 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.310453 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.310465 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:07Z","lastTransitionTime":"2026-01-27T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.339331 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 19:34:28.846530598 +0000 UTC Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.413818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.413910 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.413936 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.413964 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.413983 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:07Z","lastTransitionTime":"2026-01-27T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.517353 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.517444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.517477 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.517520 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.517550 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:07Z","lastTransitionTime":"2026-01-27T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.620998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.621043 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.621053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.621070 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.621082 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:07Z","lastTransitionTime":"2026-01-27T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.724177 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.724232 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.724249 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.724271 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.724285 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:07Z","lastTransitionTime":"2026-01-27T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.827474 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.827528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.827539 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.827558 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.827571 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:07Z","lastTransitionTime":"2026-01-27T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.935638 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.935686 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.935699 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.935718 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.935730 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:07Z","lastTransitionTime":"2026-01-27T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.038603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.038658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.038670 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.038689 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.038703 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:08Z","lastTransitionTime":"2026-01-27T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.140998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.141047 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.141064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.141086 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.141101 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:08Z","lastTransitionTime":"2026-01-27T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.244360 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.244420 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.244434 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.244460 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.244478 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:08Z","lastTransitionTime":"2026-01-27T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.339932 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 11:24:35.655453313 +0000 UTC Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.347462 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.347505 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.347518 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.347537 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.347552 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:08Z","lastTransitionTime":"2026-01-27T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.355962 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.355983 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.356049 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.356110 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:08 crc kubenswrapper[4774]: E0127 00:08:08.356100 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:08 crc kubenswrapper[4774]: E0127 00:08:08.356202 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:08 crc kubenswrapper[4774]: E0127 00:08:08.356321 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:08 crc kubenswrapper[4774]: E0127 00:08:08.356551 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.450767 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.450817 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.450832 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.450848 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.450875 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:08Z","lastTransitionTime":"2026-01-27T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.554495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.554547 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.554560 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.554578 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.554590 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:08Z","lastTransitionTime":"2026-01-27T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.657949 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.657996 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.658021 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.658040 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.658051 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:08Z","lastTransitionTime":"2026-01-27T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.761696 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.761756 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.761770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.761792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.761807 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:08Z","lastTransitionTime":"2026-01-27T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.865340 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.865384 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.865394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.865408 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.865421 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:08Z","lastTransitionTime":"2026-01-27T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.967966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.968008 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.968022 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.968040 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.968052 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:08Z","lastTransitionTime":"2026-01-27T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.070995 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.071044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.071090 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.071111 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.071123 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:09Z","lastTransitionTime":"2026-01-27T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.173914 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.173966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.173977 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.173993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.174001 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:09Z","lastTransitionTime":"2026-01-27T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.276962 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.276999 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.277009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.277025 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.277035 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:09Z","lastTransitionTime":"2026-01-27T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.340550 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 18:01:47.802081129 +0000 UTC Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.379016 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.379057 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.379070 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.379087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.379101 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:09Z","lastTransitionTime":"2026-01-27T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.481748 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.481815 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.481834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.481889 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.481909 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:09Z","lastTransitionTime":"2026-01-27T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.584216 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.584295 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.584338 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.584360 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.584374 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:09Z","lastTransitionTime":"2026-01-27T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.686904 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.686955 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.686966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.686986 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.686998 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:09Z","lastTransitionTime":"2026-01-27T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.789736 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.789779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.789794 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.789813 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.789825 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:09Z","lastTransitionTime":"2026-01-27T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.892410 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.892500 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.892523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.892552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.892573 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:09Z","lastTransitionTime":"2026-01-27T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.995050 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.995093 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.995103 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.995116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.995126 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:09Z","lastTransitionTime":"2026-01-27T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.098630 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.098687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.098706 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.098731 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.098753 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:10Z","lastTransitionTime":"2026-01-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.201320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.201376 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.201394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.201414 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.201428 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:10Z","lastTransitionTime":"2026-01-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.308960 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.309019 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.309034 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.309058 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.309075 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:10Z","lastTransitionTime":"2026-01-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.334954 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:10 crc kubenswrapper[4774]: E0127 00:08:10.335112 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:08:10 crc kubenswrapper[4774]: E0127 00:08:10.335245 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs podName:e639e1da-0d65-4d42-b1fc-23d5db91e9e6 nodeName:}" failed. No retries permitted until 2026-01-27 00:08:42.335220546 +0000 UTC m=+100.640997430 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs") pod "network-metrics-daemon-6djzf" (UID: "e639e1da-0d65-4d42-b1fc-23d5db91e9e6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.341333 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 05:27:09.31000823 +0000 UTC Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.356943 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.356999 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.357027 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:10 crc kubenswrapper[4774]: E0127 00:08:10.357149 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:10 crc kubenswrapper[4774]: E0127 00:08:10.357237 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.357286 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:10 crc kubenswrapper[4774]: E0127 00:08:10.357317 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:10 crc kubenswrapper[4774]: E0127 00:08:10.357453 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.412393 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.412464 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.412482 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.412510 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.412527 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:10Z","lastTransitionTime":"2026-01-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.515176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.515242 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.515261 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.515293 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.515312 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:10Z","lastTransitionTime":"2026-01-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.618246 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.618321 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.618345 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.618376 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.618399 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:10Z","lastTransitionTime":"2026-01-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.725335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.725378 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.725397 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.725425 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.725441 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:10Z","lastTransitionTime":"2026-01-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.828743 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.828819 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.828834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.828902 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.828927 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:10Z","lastTransitionTime":"2026-01-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.932029 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.932085 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.932095 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.932117 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.932129 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:10Z","lastTransitionTime":"2026-01-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.035027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.035075 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.035086 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.035109 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.035121 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:11Z","lastTransitionTime":"2026-01-27T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.139397 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.139443 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.139454 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.139470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.139483 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:11Z","lastTransitionTime":"2026-01-27T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.242391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.242459 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.242480 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.242508 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.242530 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:11Z","lastTransitionTime":"2026-01-27T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.342274 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 04:13:30.260487847 +0000 UTC Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.345042 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.345154 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.345171 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.345193 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.345205 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:11Z","lastTransitionTime":"2026-01-27T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.449542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.449594 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.449609 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.449631 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.449641 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:11Z","lastTransitionTime":"2026-01-27T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.552774 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.552831 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.552850 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.552914 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.552933 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:11Z","lastTransitionTime":"2026-01-27T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.656211 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.656260 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.656269 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.656289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.656300 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:11Z","lastTransitionTime":"2026-01-27T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.759347 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.759380 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.759389 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.759401 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.759410 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:11Z","lastTransitionTime":"2026-01-27T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.862304 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.862363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.862375 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.862390 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.862401 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:11Z","lastTransitionTime":"2026-01-27T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.876819 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtz9l_0abcf78e-9b05-4b89-94f3-4d3230886ce0/kube-multus/0.log" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.876899 4774 generic.go:334] "Generic (PLEG): container finished" podID="0abcf78e-9b05-4b89-94f3-4d3230886ce0" containerID="62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b" exitCode=1 Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.876938 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mtz9l" event={"ID":"0abcf78e-9b05-4b89-94f3-4d3230886ce0","Type":"ContainerDied","Data":"62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.877393 4774 scope.go:117] "RemoveContainer" containerID="62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.895737 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:11Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.914936 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:11Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.932097 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:11Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.960163 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"ervices_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293219 6404 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293221 6404 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0127 00:07:53.292667 6404 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:11Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.965337 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.965362 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.965370 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.965384 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.965393 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:11Z","lastTransitionTime":"2026-01-27T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.977746 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:11Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.992818 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:11Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.009440 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.021466 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.032666 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"2026-01-27T00:07:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7\\\\n2026-01-27T00:07:25+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7 to /host/opt/cni/bin/\\\\n2026-01-27T00:07:26Z [verbose] multus-daemon started\\\\n2026-01-27T00:07:26Z [verbose] Readiness Indicator file check\\\\n2026-01-27T00:08:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.042153 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.054943 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.068091 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.068124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.068133 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.068147 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 
00:08:12.068157 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:12Z","lastTransitionTime":"2026-01-27T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.069899 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.081285 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 
00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.094110 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.107970 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.119127 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.145230 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.158382 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.170591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.170630 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.170645 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.170663 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.170678 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:12Z","lastTransitionTime":"2026-01-27T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.273829 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.273903 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.273918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.273938 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.273953 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:12Z","lastTransitionTime":"2026-01-27T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.343911 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 02:37:16.977411306 +0000 UTC Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.356358 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.356373 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:12 crc kubenswrapper[4774]: E0127 00:08:12.356496 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.356549 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.356585 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:12 crc kubenswrapper[4774]: E0127 00:08:12.356679 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:12 crc kubenswrapper[4774]: E0127 00:08:12.356698 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:12 crc kubenswrapper[4774]: E0127 00:08:12.356742 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.368287 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.377217 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.377266 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.377279 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.377298 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.377340 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:12Z","lastTransitionTime":"2026-01-27T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.378208 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.394434 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-2
7T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2
f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.407008 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.420788 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.431271 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.449657 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.470735 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"ervices_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293219 6404 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293221 6404 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0127 00:07:53.292667 6404 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.480294 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.480333 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.480345 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.480363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.480374 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:12Z","lastTransitionTime":"2026-01-27T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.483055 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.494469 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.505577 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.516338 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.528140 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"2026-01-27T00:07:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7\\\\n2026-01-27T00:07:25+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7 to /host/opt/cni/bin/\\\\n2026-01-27T00:07:26Z [verbose] multus-daemon started\\\\n2026-01-27T00:07:26Z [verbose] Readiness Indicator file check\\\\n2026-01-27T00:08:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.540121 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.552236 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.562128 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.573989 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:
36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.585552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.585591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.585602 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.585617 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.585628 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:12Z","lastTransitionTime":"2026-01-27T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.587000 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.687603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.687670 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.687696 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.687723 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.687743 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:12Z","lastTransitionTime":"2026-01-27T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.791097 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.791143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.791156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.791176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.791186 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:12Z","lastTransitionTime":"2026-01-27T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.883758 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtz9l_0abcf78e-9b05-4b89-94f3-4d3230886ce0/kube-multus/0.log" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.883847 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mtz9l" event={"ID":"0abcf78e-9b05-4b89-94f3-4d3230886ce0","Type":"ContainerStarted","Data":"01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972"} Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.895795 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.895897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.895924 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.895956 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.895978 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:12Z","lastTransitionTime":"2026-01-27T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.920466 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.929973 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.940448 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.950685 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.963795 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.983285 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"ervices_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293219 6404 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293221 6404 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0127 00:07:53.292667 6404 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.996459 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.998281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.998316 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.998330 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.998350 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.998367 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:12Z","lastTransitionTime":"2026-01-27T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.008966 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.021201 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.041211 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.076491 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"2026-01-27T00:07:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7\\\\n2026-01-27T00:07:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7 to /host/opt/cni/bin/\\\\n2026-01-27T00:07:26Z [verbose] multus-daemon started\\\\n2026-01-27T00:07:26Z [verbose] Readiness Indicator file check\\\\n2026-01-27T00:08:11Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.089647 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.100706 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.100756 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.100770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.100787 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.100797 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.101839 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.112838 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.122121 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.133269 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.144819 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.154685 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.250175 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.250220 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.250230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.250247 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.250265 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.344245 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 04:26:40.392832452 +0000 UTC Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.353685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.353734 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.353748 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.353771 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.353787 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.457197 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.457259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.457274 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.457298 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.457316 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.561188 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.561252 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.561262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.561281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.561294 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.660893 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.660948 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.660959 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.660977 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.660991 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: E0127 00:08:13.679461 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.685354 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.685437 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.685458 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.685485 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.685507 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: E0127 00:08:13.707201 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.711415 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.711470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.711483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.711506 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.711520 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: E0127 00:08:13.726164 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.730830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.730882 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.730897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.730915 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.730928 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: E0127 00:08:13.743805 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.749177 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.749230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.749243 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.749263 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.749279 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: E0127 00:08:13.762848 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: E0127 00:08:13.763017 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.764779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.764820 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.764834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.764882 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.764896 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.868313 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.868369 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.868380 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.868400 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.868418 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.971351 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.971386 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.971398 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.971414 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.971424 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.074545 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.074605 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.074616 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.074635 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.074646 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:14Z","lastTransitionTime":"2026-01-27T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.177830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.177901 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.177913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.177929 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.178052 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:14Z","lastTransitionTime":"2026-01-27T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.281665 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.281730 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.281749 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.281779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.281797 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:14Z","lastTransitionTime":"2026-01-27T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.345049 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 00:39:39.921486423 +0000 UTC Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.357102 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.357220 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:14 crc kubenswrapper[4774]: E0127 00:08:14.357307 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.357119 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:14 crc kubenswrapper[4774]: E0127 00:08:14.357399 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:14 crc kubenswrapper[4774]: E0127 00:08:14.357443 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.357102 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:14 crc kubenswrapper[4774]: E0127 00:08:14.357522 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.384972 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.385023 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.385036 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.385066 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.385084 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:14Z","lastTransitionTime":"2026-01-27T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.488379 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.488438 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.488453 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.488473 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.488493 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:14Z","lastTransitionTime":"2026-01-27T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.590514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.590551 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.590561 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.590575 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.590601 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:14Z","lastTransitionTime":"2026-01-27T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.693432 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.693479 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.693488 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.693504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.693515 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:14Z","lastTransitionTime":"2026-01-27T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.795919 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.795956 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.795964 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.795978 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.795987 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:14Z","lastTransitionTime":"2026-01-27T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.898182 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.898230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.898240 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.898257 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.898268 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:14Z","lastTransitionTime":"2026-01-27T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.000892 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.000923 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.000932 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.000945 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.000955 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:15Z","lastTransitionTime":"2026-01-27T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.103148 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.103194 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.103206 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.103223 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.103235 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:15Z","lastTransitionTime":"2026-01-27T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.205796 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.205882 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.205897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.205918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.205931 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:15Z","lastTransitionTime":"2026-01-27T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.308941 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.308988 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.308997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.309012 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.309023 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:15Z","lastTransitionTime":"2026-01-27T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.345283 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 15:50:15.169470141 +0000 UTC Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.413238 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.413284 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.413292 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.413308 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.413320 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:15Z","lastTransitionTime":"2026-01-27T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.516208 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.516265 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.516278 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.516300 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.516313 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:15Z","lastTransitionTime":"2026-01-27T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.619104 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.619182 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.619206 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.619236 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.619259 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:15Z","lastTransitionTime":"2026-01-27T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.722494 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.722542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.722559 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.722587 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.722603 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:15Z","lastTransitionTime":"2026-01-27T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.825456 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.825526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.825543 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.825574 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.825592 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:15Z","lastTransitionTime":"2026-01-27T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.927883 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.927935 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.927948 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.927966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.927978 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:15Z","lastTransitionTime":"2026-01-27T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.030239 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.030291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.030301 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.030315 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.030325 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:16Z","lastTransitionTime":"2026-01-27T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.133064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.133111 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.133124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.133140 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.133149 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:16Z","lastTransitionTime":"2026-01-27T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.236113 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.236166 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.236177 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.236193 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.236203 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:16Z","lastTransitionTime":"2026-01-27T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.338285 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.338346 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.338366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.338389 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.338407 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:16Z","lastTransitionTime":"2026-01-27T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.345360 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 18:10:12.204833182 +0000 UTC Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.355907 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.355969 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.355971 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:16 crc kubenswrapper[4774]: E0127 00:08:16.356058 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.356108 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:16 crc kubenswrapper[4774]: E0127 00:08:16.356210 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:16 crc kubenswrapper[4774]: E0127 00:08:16.356343 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:16 crc kubenswrapper[4774]: E0127 00:08:16.356457 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.441123 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.441167 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.441175 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.441191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.441200 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:16Z","lastTransitionTime":"2026-01-27T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.544351 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.544392 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.544402 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.544418 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.544429 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:16Z","lastTransitionTime":"2026-01-27T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.655724 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.655804 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.655830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.655881 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.655894 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:16Z","lastTransitionTime":"2026-01-27T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.758275 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.758344 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.758354 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.758367 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.758375 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:16Z","lastTransitionTime":"2026-01-27T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.859988 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.860045 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.860057 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.860071 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.860083 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:16Z","lastTransitionTime":"2026-01-27T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.962552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.962628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.962643 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.962662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.962674 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:16Z","lastTransitionTime":"2026-01-27T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.065302 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.065350 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.065361 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.065378 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.065390 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:17Z","lastTransitionTime":"2026-01-27T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.167631 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.168076 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.168157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.168275 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.168357 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:17Z","lastTransitionTime":"2026-01-27T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.271692 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.271736 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.271766 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.271784 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.271794 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:17Z","lastTransitionTime":"2026-01-27T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.346468 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 09:50:17.486341569 +0000 UTC Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.374723 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.374778 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.374791 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.374810 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.374826 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:17Z","lastTransitionTime":"2026-01-27T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.477358 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.477492 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.477565 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.477634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.477695 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:17Z","lastTransitionTime":"2026-01-27T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.580533 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.580590 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.580602 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.580618 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.580630 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:17Z","lastTransitionTime":"2026-01-27T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.683973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.684046 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.684063 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.684082 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.684095 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:17Z","lastTransitionTime":"2026-01-27T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.786913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.786987 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.787009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.787039 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.787062 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:17Z","lastTransitionTime":"2026-01-27T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.890800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.891277 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.891432 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.891564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.891750 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:17Z","lastTransitionTime":"2026-01-27T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.994417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.994916 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.995209 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.995416 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.995610 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:17Z","lastTransitionTime":"2026-01-27T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.098571 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.098687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.098700 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.098719 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.098733 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:18Z","lastTransitionTime":"2026-01-27T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.202765 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.202837 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.202891 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.202925 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.202946 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:18Z","lastTransitionTime":"2026-01-27T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.306417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.306527 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.306548 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.306572 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.306590 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:18Z","lastTransitionTime":"2026-01-27T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.346915 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 05:54:32.336339351 +0000 UTC Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.356366 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.356390 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.356470 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:18 crc kubenswrapper[4774]: E0127 00:08:18.356803 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.356844 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:18 crc kubenswrapper[4774]: E0127 00:08:18.357030 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:18 crc kubenswrapper[4774]: E0127 00:08:18.357202 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:18 crc kubenswrapper[4774]: E0127 00:08:18.357327 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.358607 4774 scope.go:117] "RemoveContainer" containerID="5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.409104 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.409184 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.409209 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.409240 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.409265 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:18Z","lastTransitionTime":"2026-01-27T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.513394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.513452 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.513464 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.513488 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.513503 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:18Z","lastTransitionTime":"2026-01-27T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.615717 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.615757 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.615766 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.615779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.615789 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:18Z","lastTransitionTime":"2026-01-27T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.718302 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.718360 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.718371 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.718390 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.718400 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:18Z","lastTransitionTime":"2026-01-27T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.820979 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.821018 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.821031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.821047 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.821058 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:18Z","lastTransitionTime":"2026-01-27T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.906185 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/2.log" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.913436 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.913948 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.922760 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.922813 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.922826 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.922842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.922853 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:18Z","lastTransitionTime":"2026-01-27T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.927802 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.939610 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.948712 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.959896 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 27 
00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.978317 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.990494 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.003458 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.017101 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.025796 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.025830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:19 crc 
kubenswrapper[4774]: I0127 00:08:19.025842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.025876 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.025890 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:19Z","lastTransitionTime":"2026-01-27T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.035379 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://549d7d6cc7feaea08014a551f33f132a2b4c3737
125b473d4804e7e8af373f89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"ervices_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293219 6404 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293221 6404 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0127 00:07:53.292667 6404 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, 
Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"co
ntainerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.047676 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.061670 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.075826 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.086465 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.097977 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"2026-01-27T00:07:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7\\\\n2026-01-27T00:07:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7 to /host/opt/cni/bin/\\\\n2026-01-27T00:07:26Z [verbose] multus-daemon started\\\\n2026-01-27T00:07:26Z [verbose] Readiness Indicator file check\\\\n2026-01-27T00:08:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.108280 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.119188 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.127970 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.128014 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.128028 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.128046 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 
00:08:19.128060 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:19Z","lastTransitionTime":"2026-01-27T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.129762 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.139288 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.231119 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.231160 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.231169 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.231186 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.231199 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:19Z","lastTransitionTime":"2026-01-27T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.334054 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.334097 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.334106 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.334123 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.334140 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:19Z","lastTransitionTime":"2026-01-27T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.347445 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 20:11:07.194924017 +0000 UTC Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.436733 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.436800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.436809 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.436843 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.436881 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:19Z","lastTransitionTime":"2026-01-27T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.539190 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.539248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.539264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.539284 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.539297 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:19Z","lastTransitionTime":"2026-01-27T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.641289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.641347 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.641363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.641381 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.641394 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:19Z","lastTransitionTime":"2026-01-27T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.744358 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.744420 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.744437 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.744460 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.744478 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:19Z","lastTransitionTime":"2026-01-27T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.846459 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.846515 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.846530 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.846551 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.846563 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:19Z","lastTransitionTime":"2026-01-27T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.917235 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/3.log" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.917764 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/2.log" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.920070 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89" exitCode=1 Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.920111 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.920163 4774 scope.go:117] "RemoveContainer" containerID="5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.920964 4774 scope.go:117] "RemoveContainer" containerID="549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89" Jan 27 00:08:19 crc kubenswrapper[4774]: E0127 00:08:19.921218 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.941090 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.948596 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.948649 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.948658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.948674 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.948686 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:19Z","lastTransitionTime":"2026-01-27T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.956830 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.966603 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.983730 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.998742 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.013307 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.030154 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.052364 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"ervices_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293219 6404 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293221 6404 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0127 00:07:53.292667 6404 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:19Z\\\",\\\"message\\\":\\\" 6745 
reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154432 6745 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154709 6745 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154794 6745 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.155077 6745 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 00:08:19.155178 6745 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0127 00:08:19.155195 6745 handler.go:208] Removed *v1.Node event handler 2\\\\nI0127 00:08:19.155224 6745 factory.go:656] Stopping watch factory\\\\nI0127 00:08:19.155243 6745 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0127 00:08:19.188935 6745 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0127 00:08:19.188953 6745 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0127 00:08:19.189017 6745 ovnkube.go:599] Stopped ovnkube\\\\nI0127 00:08:19.189036 6745 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0127 00:08:19.189166 6745 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89
dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.054405 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.054450 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.054466 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.054485 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.054499 
4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:20Z","lastTransitionTime":"2026-01-27T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.066925 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.077792 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.088930 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.101999 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.112489 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.125269 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"2026-01-27T00:07:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7\\\\n2026-01-27T00:07:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7 to /host/opt/cni/bin/\\\\n2026-01-27T00:07:26Z [verbose] multus-daemon started\\\\n2026-01-27T00:07:26Z [verbose] Readiness Indicator file check\\\\n2026-01-27T00:08:11Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.138644 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.151672 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.156678 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.156730 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.156740 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.156755 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.156765 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:20Z","lastTransitionTime":"2026-01-27T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.162644 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.173561 4774 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.259114 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.259167 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.259180 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.259195 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.259207 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:20Z","lastTransitionTime":"2026-01-27T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.347953 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 05:07:11.48996847 +0000 UTC Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.356436 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.356487 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.356515 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.356447 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:20 crc kubenswrapper[4774]: E0127 00:08:20.356655 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:20 crc kubenswrapper[4774]: E0127 00:08:20.356754 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:20 crc kubenswrapper[4774]: E0127 00:08:20.356847 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:20 crc kubenswrapper[4774]: E0127 00:08:20.356937 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.361490 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.361516 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.361525 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.361540 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.361550 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:20Z","lastTransitionTime":"2026-01-27T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.463594 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.463629 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.463640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.463654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.463663 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:20Z","lastTransitionTime":"2026-01-27T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.565758 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.565800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.565810 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.565827 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.565837 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:20Z","lastTransitionTime":"2026-01-27T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.668709 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.668758 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.668773 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.668791 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.668802 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:20Z","lastTransitionTime":"2026-01-27T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.771162 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.771235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.771253 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.771279 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.771297 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:20Z","lastTransitionTime":"2026-01-27T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.874125 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.874165 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.874177 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.874191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.874201 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:20Z","lastTransitionTime":"2026-01-27T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.924367 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/3.log" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.929202 4774 scope.go:117] "RemoveContainer" containerID="549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89" Jan 27 00:08:20 crc kubenswrapper[4774]: E0127 00:08:20.929834 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.940528 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.956419 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.964524 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.973574 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 
00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.976563 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.976600 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.976610 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.976625 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.976635 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:20Z","lastTransitionTime":"2026-01-27T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.000122 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.008959 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.016982 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.035329 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://549d7d6cc7feaea08014a551f33f132a2b4c3737
125b473d4804e7e8af373f89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:19Z\\\",\\\"message\\\":\\\" 6745 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154432 6745 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154709 6745 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154794 6745 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.155077 6745 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 00:08:19.155178 6745 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0127 00:08:19.155195 6745 handler.go:208] Removed *v1.Node event handler 2\\\\nI0127 00:08:19.155224 6745 factory.go:656] Stopping watch factory\\\\nI0127 00:08:19.155243 6745 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0127 00:08:19.188935 6745 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0127 00:08:19.188953 6745 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0127 00:08:19.189017 6745 ovnkube.go:599] Stopped ovnkube\\\\nI0127 00:08:19.189036 6745 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0127 00:08:19.189166 6745 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:08:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.049371 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.063839 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.075755 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.078888 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.078940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.078957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.078979 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.078994 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:21Z","lastTransitionTime":"2026-01-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.091156 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.106023 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"2026-01-27T00:07:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7\\\\n2026-01-27T00:07:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7 to /host/opt/cni/bin/\\\\n2026-01-27T00:07:26Z [verbose] multus-daemon started\\\\n2026-01-27T00:07:26Z [verbose] Readiness Indicator file check\\\\n2026-01-27T00:08:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.119153 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.132772 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.145098 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.158613 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.170092 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.184041 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.184100 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.184115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.184136 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.184159 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:21Z","lastTransitionTime":"2026-01-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.287757 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.287791 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.287799 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.287812 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.287822 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:21Z","lastTransitionTime":"2026-01-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.348594 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 15:23:02.115257284 +0000 UTC Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.390086 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.390138 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.390154 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.390177 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.390192 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:21Z","lastTransitionTime":"2026-01-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.492354 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.492394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.492402 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.492416 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.492425 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:21Z","lastTransitionTime":"2026-01-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.595309 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.595359 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.595372 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.595392 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.595405 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:21Z","lastTransitionTime":"2026-01-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.698336 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.698386 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.698398 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.698412 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.698421 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:21Z","lastTransitionTime":"2026-01-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.801964 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.802015 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.802024 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.802041 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.802054 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:21Z","lastTransitionTime":"2026-01-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.905025 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.905073 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.905086 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.905104 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.905116 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:21Z","lastTransitionTime":"2026-01-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.007685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.007728 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.007741 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.007758 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.007771 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:22Z","lastTransitionTime":"2026-01-27T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.110201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.110277 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.110302 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.110335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.110362 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:22Z","lastTransitionTime":"2026-01-27T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.214303 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.214366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.214385 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.214414 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.214433 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:22Z","lastTransitionTime":"2026-01-27T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.317264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.317334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.317361 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.317395 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.317418 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:22Z","lastTransitionTime":"2026-01-27T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.349687 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 11:23:24.4486159 +0000 UTC Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.356145 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.356210 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.356262 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:22 crc kubenswrapper[4774]: E0127 00:08:22.356381 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.356453 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:22 crc kubenswrapper[4774]: E0127 00:08:22.356587 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:22 crc kubenswrapper[4774]: E0127 00:08:22.356717 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:22 crc kubenswrapper[4774]: E0127 00:08:22.356780 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.371354 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.387618 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"2026-01-27T00:07:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7\\\\n2026-01-27T00:07:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7 to /host/opt/cni/bin/\\\\n2026-01-27T00:07:26Z [verbose] multus-daemon started\\\\n2026-01-27T00:07:26Z [verbose] Readiness Indicator file check\\\\n2026-01-27T00:08:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.402605 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.421434 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.421497 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.421509 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.421525 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.421537 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:22Z","lastTransitionTime":"2026-01-27T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.422412 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.440250 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.460258 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.475393 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.493621 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.505853 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.518889 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 
00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.523844 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.523911 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.523923 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.523940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.523952 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:22Z","lastTransitionTime":"2026-01-27T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.542657 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.554937 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.566616 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.584985 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.607346 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:19Z\\\",\\\"message\\\":\\\" 6745 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154432 6745 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154709 6745 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154794 6745 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.155077 6745 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 00:08:19.155178 6745 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0127 00:08:19.155195 6745 handler.go:208] Removed *v1.Node event handler 2\\\\nI0127 00:08:19.155224 6745 factory.go:656] Stopping watch factory\\\\nI0127 00:08:19.155243 6745 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0127 00:08:19.188935 6745 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0127 00:08:19.188953 6745 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0127 00:08:19.189017 6745 ovnkube.go:599] Stopped ovnkube\\\\nI0127 00:08:19.189036 6745 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0127 00:08:19.189166 6745 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:08:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.624081 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.626175 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.626219 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.626232 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.626252 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.626265 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:22Z","lastTransitionTime":"2026-01-27T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.644760 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.659602 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.729577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.729632 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.729647 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.729666 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.729680 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:22Z","lastTransitionTime":"2026-01-27T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.832837 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.832929 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.832947 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.832971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.832990 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:22Z","lastTransitionTime":"2026-01-27T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.935091 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.935157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.935166 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.935183 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.935193 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:22Z","lastTransitionTime":"2026-01-27T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.037940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.037999 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.038019 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.038039 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.038051 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:23Z","lastTransitionTime":"2026-01-27T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.141831 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.141909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.141924 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.141948 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.141962 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:23Z","lastTransitionTime":"2026-01-27T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.244915 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.244984 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.244997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.245015 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.245048 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:23Z","lastTransitionTime":"2026-01-27T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.347391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.347435 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.347444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.347462 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.347472 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:23Z","lastTransitionTime":"2026-01-27T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.350572 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 21:22:12.743859777 +0000 UTC Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.449605 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.449649 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.449660 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.449678 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.449688 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:23Z","lastTransitionTime":"2026-01-27T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.552059 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.552102 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.552114 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.552127 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.552137 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:23Z","lastTransitionTime":"2026-01-27T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.654943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.654994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.655024 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.655039 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.655049 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:23Z","lastTransitionTime":"2026-01-27T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.758630 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.758676 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.758688 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.758706 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.758716 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:23Z","lastTransitionTime":"2026-01-27T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.861716 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.861762 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.861772 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.861786 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.861796 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:23Z","lastTransitionTime":"2026-01-27T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.964367 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.964424 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.964445 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.964470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.964483 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:23Z","lastTransitionTime":"2026-01-27T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.059674 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.059726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.059739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.059753 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.059763 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: E0127 00:08:24.070958 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.074112 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.074149 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.074158 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.074179 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.074190 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: E0127 00:08:24.084351 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.087591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.087625 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.087637 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.087652 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.087664 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: E0127 00:08:24.098472 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.101404 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.101436 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.101445 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.101458 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.101487 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: E0127 00:08:24.112029 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.115137 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.115187 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.115200 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.115218 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.115232 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: E0127 00:08:24.126776 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:24 crc kubenswrapper[4774]: E0127 00:08:24.126955 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.128335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.128401 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.128419 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.128445 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.128458 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.230651 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.230919 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.230933 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.231038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.231054 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.333770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.333921 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.333943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.334482 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.334539 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.351215 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 01:43:33.034718463 +0000 UTC Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.356782 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.356839 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.356810 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.357040 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:24 crc kubenswrapper[4774]: E0127 00:08:24.357028 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:24 crc kubenswrapper[4774]: E0127 00:08:24.357158 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:24 crc kubenswrapper[4774]: E0127 00:08:24.357259 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:24 crc kubenswrapper[4774]: E0127 00:08:24.357394 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.437303 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.437342 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.437350 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.437364 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.437373 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.540396 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.540451 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.540461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.540476 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.540487 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.643483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.643522 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.643531 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.643545 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.643555 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.746584 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.746628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.746638 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.746652 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.746661 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.849250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.849303 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.849315 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.849334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.849347 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.952927 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.952997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.953014 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.953038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.953059 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.056361 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.056408 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.056417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.056456 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.056470 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:25Z","lastTransitionTime":"2026-01-27T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.159264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.159316 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.159332 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.159356 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.159373 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:25Z","lastTransitionTime":"2026-01-27T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.262300 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.262374 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.262392 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.262421 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.262440 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:25Z","lastTransitionTime":"2026-01-27T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.351444 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 06:57:05.409691664 +0000 UTC Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.365488 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.365526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.365540 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.365555 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.365566 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:25Z","lastTransitionTime":"2026-01-27T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.469010 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.469070 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.469083 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.469140 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.469156 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:25Z","lastTransitionTime":"2026-01-27T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.573158 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.573207 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.573218 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.573235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.573247 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:25Z","lastTransitionTime":"2026-01-27T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.676456 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.676530 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.676550 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.676577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.676597 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:25Z","lastTransitionTime":"2026-01-27T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.778494 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.778533 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.778551 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.778569 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.778580 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:25Z","lastTransitionTime":"2026-01-27T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.881594 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.881636 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.881645 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.881659 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.881667 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:25Z","lastTransitionTime":"2026-01-27T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.983657 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.983752 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.983773 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.983801 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.983821 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:25Z","lastTransitionTime":"2026-01-27T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.086399 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.086461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.086477 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.086524 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.086537 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:26Z","lastTransitionTime":"2026-01-27T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.189777 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.189826 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.189840 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.189891 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.189910 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:26Z","lastTransitionTime":"2026-01-27T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.205845 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.206051 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.2060179 +0000 UTC m=+148.511794824 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.206118 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.206258 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.206267 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.206327 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.206309617 +0000 UTC m=+148.512086521 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.206451 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.206600 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.206583355 +0000 UTC m=+148.512360289 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.292458 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.292508 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.292519 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.292537 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.292578 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:26Z","lastTransitionTime":"2026-01-27T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.307145 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.307187 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.307335 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.307359 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.307371 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.307420 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-01-27 00:09:30.307402991 +0000 UTC m=+148.613179885 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.307552 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.307618 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.307638 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.307713 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.307693158 +0000 UTC m=+148.613470082 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.352183 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 17:21:55.672530673 +0000 UTC Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.356618 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.356700 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.356800 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.356849 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.356896 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.356978 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.357108 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.357357 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.395258 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.395299 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.395310 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.395327 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.395338 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:26Z","lastTransitionTime":"2026-01-27T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.498210 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.498291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.498315 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.498348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.498381 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:26Z","lastTransitionTime":"2026-01-27T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.601843 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.601926 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.601941 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.601964 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.601978 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:26Z","lastTransitionTime":"2026-01-27T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.704698 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.704772 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.704781 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.704796 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.704806 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:26Z","lastTransitionTime":"2026-01-27T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.807715 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.807770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.807800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.807832 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.807845 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:26Z","lastTransitionTime":"2026-01-27T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.913792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.913877 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.913891 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.913910 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.913929 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:26Z","lastTransitionTime":"2026-01-27T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.016698 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.016763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.016776 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.016792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.016804 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:27Z","lastTransitionTime":"2026-01-27T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.119693 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.119756 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.119774 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.119793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.119809 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:27Z","lastTransitionTime":"2026-01-27T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.223189 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.223273 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.223292 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.223768 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.223818 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:27Z","lastTransitionTime":"2026-01-27T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.328633 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.329131 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.329157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.329189 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.329213 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:27Z","lastTransitionTime":"2026-01-27T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.353208 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 08:21:34.205648199 +0000 UTC Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.432828 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.432898 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.432910 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.432926 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.432979 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:27Z","lastTransitionTime":"2026-01-27T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.534746 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.534800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.534811 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.534825 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.534880 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:27Z","lastTransitionTime":"2026-01-27T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.637842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.637900 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.637913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.637931 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.637942 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:27Z","lastTransitionTime":"2026-01-27T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.741015 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.741099 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.741108 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.741127 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.741138 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:27Z","lastTransitionTime":"2026-01-27T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.844147 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.844182 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.844190 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.844204 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.844213 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:27Z","lastTransitionTime":"2026-01-27T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.946828 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.946881 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.946893 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.946924 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.946936 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:27Z","lastTransitionTime":"2026-01-27T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.049087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.049161 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.049172 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.049187 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.049197 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:28Z","lastTransitionTime":"2026-01-27T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.152156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.152221 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.152235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.152255 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.152268 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:28Z","lastTransitionTime":"2026-01-27T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.254994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.255032 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.255041 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.255056 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.255066 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:28Z","lastTransitionTime":"2026-01-27T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.354381 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 09:20:08.750380122 +0000 UTC Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.355786 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.355786 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.355853 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.355917 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:28 crc kubenswrapper[4774]: E0127 00:08:28.355967 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:28 crc kubenswrapper[4774]: E0127 00:08:28.356045 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:28 crc kubenswrapper[4774]: E0127 00:08:28.356131 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:28 crc kubenswrapper[4774]: E0127 00:08:28.356241 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.357603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.357639 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.357650 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.357668 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.357680 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:28Z","lastTransitionTime":"2026-01-27T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.459685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.459729 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.459742 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.459758 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.459805 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:28Z","lastTransitionTime":"2026-01-27T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.562279 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.562320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.562330 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.562344 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.562354 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:28Z","lastTransitionTime":"2026-01-27T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.665751 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.665836 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.665907 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.665942 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.665967 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:28Z","lastTransitionTime":"2026-01-27T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.768468 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.768534 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.768545 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.768559 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.768572 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:28Z","lastTransitionTime":"2026-01-27T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.871636 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.871712 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.871736 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.871762 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.871785 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:28Z","lastTransitionTime":"2026-01-27T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.975435 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.975495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.975512 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.975538 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.975555 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:28Z","lastTransitionTime":"2026-01-27T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.078009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.078095 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.078115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.078142 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.078166 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:29Z","lastTransitionTime":"2026-01-27T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.181091 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.181254 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.181277 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.181338 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.181361 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:29Z","lastTransitionTime":"2026-01-27T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.283731 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.283916 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.283934 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.283973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.283991 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:29Z","lastTransitionTime":"2026-01-27T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.354649 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 01:55:06.289352394 +0000 UTC Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.387044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.387152 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.387211 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.387236 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.387256 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:29Z","lastTransitionTime":"2026-01-27T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.490320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.490375 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.490391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.490417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.490437 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:29Z","lastTransitionTime":"2026-01-27T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.593512 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.593562 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.593571 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.593587 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.593602 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:29Z","lastTransitionTime":"2026-01-27T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.698204 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.698307 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.698350 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.698385 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.698410 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:29Z","lastTransitionTime":"2026-01-27T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.801797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.801844 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.801881 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.801902 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.801917 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:29Z","lastTransitionTime":"2026-01-27T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.904585 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.904657 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.904682 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.904714 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.904733 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:29Z","lastTransitionTime":"2026-01-27T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.007136 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.007316 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.007346 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.007381 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.007406 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:30Z","lastTransitionTime":"2026-01-27T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.110304 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.110374 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.110393 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.110450 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.110470 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:30Z","lastTransitionTime":"2026-01-27T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.214151 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.214256 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.214283 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.214317 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.214340 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:30Z","lastTransitionTime":"2026-01-27T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.317023 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.317060 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.317068 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.317084 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.317094 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:30Z","lastTransitionTime":"2026-01-27T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.355478 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 18:01:58.924087451 +0000 UTC Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.355632 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.355688 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.355693 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.355632 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:30 crc kubenswrapper[4774]: E0127 00:08:30.355796 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:30 crc kubenswrapper[4774]: E0127 00:08:30.355978 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:30 crc kubenswrapper[4774]: E0127 00:08:30.356061 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:30 crc kubenswrapper[4774]: E0127 00:08:30.356125 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.420823 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.420991 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.421022 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.421077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.421105 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:30Z","lastTransitionTime":"2026-01-27T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.524241 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.524290 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.524341 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.524367 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.524385 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:30Z","lastTransitionTime":"2026-01-27T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.627605 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.627677 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.627690 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.627713 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.627729 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:30Z","lastTransitionTime":"2026-01-27T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.730894 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.730973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.730986 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.731007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.731020 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:30Z","lastTransitionTime":"2026-01-27T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.834363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.834437 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.834461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.834496 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.834521 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:30Z","lastTransitionTime":"2026-01-27T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.938068 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.938137 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.938160 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.938195 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.938219 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:30Z","lastTransitionTime":"2026-01-27T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.041318 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.041389 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.041444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.041477 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.041496 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:31Z","lastTransitionTime":"2026-01-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.144908 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.144999 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.145009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.145035 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.145051 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:31Z","lastTransitionTime":"2026-01-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.248956 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.249031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.249049 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.249074 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.249092 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:31Z","lastTransitionTime":"2026-01-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.352192 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.352251 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.352269 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.352295 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.352314 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:31Z","lastTransitionTime":"2026-01-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.356508 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 23:03:21.817513257 +0000 UTC Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.372838 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.454634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.454676 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.454687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.454705 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.454719 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:31Z","lastTransitionTime":"2026-01-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.557643 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.557716 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.557744 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.557775 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.557811 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:31Z","lastTransitionTime":"2026-01-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.662671 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.662758 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.662780 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.662808 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.662837 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:31Z","lastTransitionTime":"2026-01-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.767274 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.767369 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.767396 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.767433 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.767461 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:31Z","lastTransitionTime":"2026-01-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.871763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.871896 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.871929 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.872007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.872033 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:31Z","lastTransitionTime":"2026-01-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.974730 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.974800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.974824 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.974888 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.974916 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:31Z","lastTransitionTime":"2026-01-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.078713 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.078794 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.078813 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.078838 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.078855 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:32Z","lastTransitionTime":"2026-01-27T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.182249 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.182320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.182340 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.182367 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.182389 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:32Z","lastTransitionTime":"2026-01-27T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.285179 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.285256 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.285279 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.285309 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.285329 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:32Z","lastTransitionTime":"2026-01-27T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.356136 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.356213 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.356162 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.356205 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:32 crc kubenswrapper[4774]: E0127 00:08:32.356325 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:32 crc kubenswrapper[4774]: E0127 00:08:32.356525 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:32 crc kubenswrapper[4774]: E0127 00:08:32.356620 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:32 crc kubenswrapper[4774]: E0127 00:08:32.356674 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.356736 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 11:41:23.282697993 +0000 UTC Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.373124 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\"
:\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.385672 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.388236 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.388275 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.388291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.388314 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.388328 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:32Z","lastTransitionTime":"2026-01-27T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.400902 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.421049 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.440199 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.453687 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.480790 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.491001 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.491107 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.491132 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.491194 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.491218 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:32Z","lastTransitionTime":"2026-01-27T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.504712 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.521178 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.540437 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.557654 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 
2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.579102 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3
3ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:19Z\\\",\\\"message\\\":\\\" 6745 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154432 6745 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154709 6745 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154794 6745 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.155077 6745 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 00:08:19.155178 6745 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0127 00:08:19.155195 6745 handler.go:208] Removed *v1.Node event handler 2\\\\nI0127 00:08:19.155224 6745 factory.go:656] Stopping watch factory\\\\nI0127 00:08:19.155243 6745 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0127 00:08:19.188935 6745 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0127 00:08:19.188953 6745 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0127 00:08:19.189017 6745 ovnkube.go:599] Stopped ovnkube\\\\nI0127 00:08:19.189036 6745 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0127 00:08:19.189166 6745 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:08:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.593370 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8953bcc-27ee-4be1-a990-9f559ea6bbc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0e4428feac8787d889cfa23715983ac112bac861e3e00ab7ae19ec676964eb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa7b231202686b65f633283644052e65287917c7621e
708f036c1a7278863b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aaa7b231202686b65f633283644052e65287917c7621e708f036c1a7278863b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.594815 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.594918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.594940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.594970 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.594991 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:32Z","lastTransitionTime":"2026-01-27T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.607397 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.619722 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.631071 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.641904 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.665805 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"2026-01-27T00:07:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7\\\\n2026-01-27T00:07:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7 to /host/opt/cni/bin/\\\\n2026-01-27T00:07:26Z [verbose] multus-daemon started\\\\n2026-01-27T00:07:26Z [verbose] Readiness Indicator file check\\\\n2026-01-27T00:08:11Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.679702 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.698681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.698735 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.698748 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.698769 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.699118 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:32Z","lastTransitionTime":"2026-01-27T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.801997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.802032 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.802042 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.802055 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.802064 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:32Z","lastTransitionTime":"2026-01-27T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.905027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.905100 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.905124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.905151 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.905170 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:32Z","lastTransitionTime":"2026-01-27T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.009011 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.009090 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.009112 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.009137 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.009155 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:33Z","lastTransitionTime":"2026-01-27T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.113006 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.113089 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.113107 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.113133 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.113150 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:33Z","lastTransitionTime":"2026-01-27T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.217046 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.217110 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.217130 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.217159 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.217181 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:33Z","lastTransitionTime":"2026-01-27T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.320288 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.320337 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.320347 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.320363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.320375 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:33Z","lastTransitionTime":"2026-01-27T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.358001 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 06:30:47.952168566 +0000 UTC Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.424022 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.424074 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.424084 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.424102 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.424114 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:33Z","lastTransitionTime":"2026-01-27T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.526293 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.526365 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.526374 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.526388 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.526402 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:33Z","lastTransitionTime":"2026-01-27T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.629434 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.629470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.629480 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.629493 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.629513 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:33Z","lastTransitionTime":"2026-01-27T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.732462 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.732511 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.732521 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.732537 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.732549 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:33Z","lastTransitionTime":"2026-01-27T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.835144 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.835191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.835200 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.835215 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.835225 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:33Z","lastTransitionTime":"2026-01-27T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.938326 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.938398 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.938422 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.938457 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.938481 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:33Z","lastTransitionTime":"2026-01-27T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.040909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.040967 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.040980 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.041002 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.041016 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.254567 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.254634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.254649 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.254669 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.254682 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.264340 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.264379 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.264388 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.264401 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.264411 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: E0127 00:08:34.277826 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.280910 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.280939 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.280952 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.280969 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.280982 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: E0127 00:08:34.293389 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.299984 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.300044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.300062 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.300099 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.300115 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: E0127 00:08:34.314193 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.317377 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.317418 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.317428 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.317446 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.317478 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: E0127 00:08:34.328763 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.333242 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.333301 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.333317 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.333338 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.333354 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: E0127 00:08:34.346500 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:34 crc kubenswrapper[4774]: E0127 00:08:34.346729 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.356020 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.356062 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.356093 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.356036 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:34 crc kubenswrapper[4774]: E0127 00:08:34.356180 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:34 crc kubenswrapper[4774]: E0127 00:08:34.356283 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:34 crc kubenswrapper[4774]: E0127 00:08:34.356429 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:34 crc kubenswrapper[4774]: E0127 00:08:34.356548 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.357765 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.357823 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.357834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.357852 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.357884 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.358140 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 07:48:53.728382865 +0000 UTC Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.460387 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.460456 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.460470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.460488 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.460501 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.563642 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.563726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.563746 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.563778 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.563805 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.667196 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.667224 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.667234 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.667248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.667257 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.770669 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.770744 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.770763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.770793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.770813 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.874440 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.874495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.874504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.874519 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.874530 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.978149 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.978202 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.978215 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.978235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.978248 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.081336 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.081381 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.081394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.081417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.081431 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:35Z","lastTransitionTime":"2026-01-27T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.185263 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.185325 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.185336 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.185355 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.185368 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:35Z","lastTransitionTime":"2026-01-27T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.289741 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.289818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.289842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.289909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.289938 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:35Z","lastTransitionTime":"2026-01-27T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.357679 4774 scope.go:117] "RemoveContainer" containerID="549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89" Jan 27 00:08:35 crc kubenswrapper[4774]: E0127 00:08:35.357975 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.358263 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 05:49:55.50460727 +0000 UTC Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.393221 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.393286 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.393300 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.393320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.393334 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:35Z","lastTransitionTime":"2026-01-27T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.496883 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.496937 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.496948 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.496966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.496977 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:35Z","lastTransitionTime":"2026-01-27T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.600500 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.600552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.600561 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.600581 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.600593 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:35Z","lastTransitionTime":"2026-01-27T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.704357 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.704446 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.704470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.704504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.704690 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:35Z","lastTransitionTime":"2026-01-27T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.807672 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.807737 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.807751 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.807774 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.807793 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:35Z","lastTransitionTime":"2026-01-27T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.911695 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.911819 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.911913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.911954 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.912122 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:35Z","lastTransitionTime":"2026-01-27T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.015351 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.015776 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.016003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.016171 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.016364 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:36Z","lastTransitionTime":"2026-01-27T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.120555 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.121365 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.121527 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.121673 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.121808 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:36Z","lastTransitionTime":"2026-01-27T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.225582 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.225644 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.225662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.225688 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.225705 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:36Z","lastTransitionTime":"2026-01-27T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.329133 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.329184 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.329198 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.329217 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.329230 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:36Z","lastTransitionTime":"2026-01-27T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.355923 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.355960 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.356011 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:36 crc kubenswrapper[4774]: E0127 00:08:36.356140 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:36 crc kubenswrapper[4774]: E0127 00:08:36.356223 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:36 crc kubenswrapper[4774]: E0127 00:08:36.356379 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.355957 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:36 crc kubenswrapper[4774]: E0127 00:08:36.356520 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.358629 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 04:31:05.696620587 +0000 UTC Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.431962 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.432558 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.432759 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.433004 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.433240 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:36Z","lastTransitionTime":"2026-01-27T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.537372 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.537753 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.537830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.537975 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.538050 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:36Z","lastTransitionTime":"2026-01-27T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.642122 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.642201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.642219 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.642248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.642271 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:36Z","lastTransitionTime":"2026-01-27T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.746237 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.746304 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.746331 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.746365 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.746390 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:36Z","lastTransitionTime":"2026-01-27T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.849764 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.849850 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.849909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.849949 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.849988 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:36Z","lastTransitionTime":"2026-01-27T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.953464 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.953542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.953564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.953799 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.953896 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:36Z","lastTransitionTime":"2026-01-27T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.058044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.058092 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.058103 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.058124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.058136 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:37Z","lastTransitionTime":"2026-01-27T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.162303 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.162577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.162708 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.162819 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.162939 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:37Z","lastTransitionTime":"2026-01-27T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.266422 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.266786 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.267098 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.267264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.267419 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:37Z","lastTransitionTime":"2026-01-27T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.359118 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 21:29:52.765344455 +0000 UTC Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.371134 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.371196 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.371209 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.371226 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.371237 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:37Z","lastTransitionTime":"2026-01-27T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.473440 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.473492 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.473506 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.473524 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.473537 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:37Z","lastTransitionTime":"2026-01-27T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.576275 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.576355 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.576379 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.576409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.576431 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:37Z","lastTransitionTime":"2026-01-27T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.680154 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.680196 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.680220 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.680237 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.680246 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:37Z","lastTransitionTime":"2026-01-27T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.783461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.783542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.783561 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.783587 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.783604 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:37Z","lastTransitionTime":"2026-01-27T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.886276 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.886346 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.886358 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.886375 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.886384 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:37Z","lastTransitionTime":"2026-01-27T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.989013 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.989071 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.989084 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.989105 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.989118 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:37Z","lastTransitionTime":"2026-01-27T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.091623 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.091672 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.091683 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.091698 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.091710 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:38Z","lastTransitionTime":"2026-01-27T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.194179 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.194250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.194266 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.194288 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.194305 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:38Z","lastTransitionTime":"2026-01-27T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.296644 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.296709 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.296723 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.296743 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.296758 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:38Z","lastTransitionTime":"2026-01-27T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.356247 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.356318 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.356442 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:38 crc kubenswrapper[4774]: E0127 00:08:38.356443 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:38 crc kubenswrapper[4774]: E0127 00:08:38.356594 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:38 crc kubenswrapper[4774]: E0127 00:08:38.356710 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.356827 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:38 crc kubenswrapper[4774]: E0127 00:08:38.356973 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.359206 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 17:02:54.561244176 +0000 UTC Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.398750 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.398795 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.398810 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.398830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.398846 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:38Z","lastTransitionTime":"2026-01-27T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.500735 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.500784 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.500795 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.500810 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.500821 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:38Z","lastTransitionTime":"2026-01-27T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.603739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.603781 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.603789 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.603809 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.603818 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:38Z","lastTransitionTime":"2026-01-27T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.707155 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.707198 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.707211 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.707227 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.707239 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:38Z","lastTransitionTime":"2026-01-27T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.809842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.809930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.809949 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.809972 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.809991 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:38Z","lastTransitionTime":"2026-01-27T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.913827 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.913932 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.913952 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.913979 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.913998 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:38Z","lastTransitionTime":"2026-01-27T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.017101 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.017568 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.017716 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.017965 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.018120 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:39Z","lastTransitionTime":"2026-01-27T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.120741 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.120971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.120988 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.121009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.121025 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:39Z","lastTransitionTime":"2026-01-27T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.224347 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.224411 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.224428 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.224453 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.224472 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:39Z","lastTransitionTime":"2026-01-27T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.327316 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.327366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.327379 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.327396 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.327408 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:39Z","lastTransitionTime":"2026-01-27T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.359329 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 01:38:46.831525133 +0000 UTC Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.429906 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.429966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.429981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.430003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.430019 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:39Z","lastTransitionTime":"2026-01-27T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.532987 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.533064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.533088 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.533125 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.533151 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:39Z","lastTransitionTime":"2026-01-27T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.637340 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.637421 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.637441 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.637470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.637489 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:39Z","lastTransitionTime":"2026-01-27T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.740630 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.740696 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.740723 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.740752 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.740776 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:39Z","lastTransitionTime":"2026-01-27T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.843634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.843707 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.843722 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.843742 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.843757 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:39Z","lastTransitionTime":"2026-01-27T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.946760 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.946814 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.946831 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.946851 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.946884 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:39Z","lastTransitionTime":"2026-01-27T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.049205 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.049250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.049258 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.049274 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.049284 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:40Z","lastTransitionTime":"2026-01-27T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.151670 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.151717 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.151726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.151741 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.151751 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:40Z","lastTransitionTime":"2026-01-27T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.254576 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.254668 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.254692 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.254724 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.254746 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:40Z","lastTransitionTime":"2026-01-27T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.355643 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.355647 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.355701 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.355727 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:40 crc kubenswrapper[4774]: E0127 00:08:40.356410 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:40 crc kubenswrapper[4774]: E0127 00:08:40.356547 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:40 crc kubenswrapper[4774]: E0127 00:08:40.356603 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:40 crc kubenswrapper[4774]: E0127 00:08:40.356623 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.358026 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.358054 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.358061 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.358074 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.358083 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:40Z","lastTransitionTime":"2026-01-27T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.359445 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 22:24:39.667592291 +0000 UTC Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.462150 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.462217 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.462238 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.462263 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.462283 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:40Z","lastTransitionTime":"2026-01-27T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.565236 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.565302 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.565321 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.565344 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.565358 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:40Z","lastTransitionTime":"2026-01-27T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.668115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.668163 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.668177 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.668199 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.668215 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:40Z","lastTransitionTime":"2026-01-27T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.770994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.771980 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.772025 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.772061 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.772098 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:40Z","lastTransitionTime":"2026-01-27T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.875068 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.875122 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.875134 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.875151 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.875163 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:40Z","lastTransitionTime":"2026-01-27T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.978361 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.978423 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.978441 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.978467 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.978486 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:40Z","lastTransitionTime":"2026-01-27T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.081761 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.081817 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.081835 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.081890 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.081908 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:41Z","lastTransitionTime":"2026-01-27T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.184592 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.184648 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.184660 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.184678 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.184691 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:41Z","lastTransitionTime":"2026-01-27T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.288031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.288106 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.288130 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.288162 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.288186 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:41Z","lastTransitionTime":"2026-01-27T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.360394 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 09:02:39.85864256 +0000 UTC Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.390645 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.390944 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.390976 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.391003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.391020 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:41Z","lastTransitionTime":"2026-01-27T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.493991 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.494299 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.494422 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.494534 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.494674 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:41Z","lastTransitionTime":"2026-01-27T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.598522 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.598634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.598905 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.598981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.599013 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:41Z","lastTransitionTime":"2026-01-27T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.701780 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.701907 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.701939 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.701971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.701996 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:41Z","lastTransitionTime":"2026-01-27T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.805469 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.805746 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.805776 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.805810 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.805830 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:41Z","lastTransitionTime":"2026-01-27T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.908918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.908988 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.909006 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.909029 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.909047 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:41Z","lastTransitionTime":"2026-01-27T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.012957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.013016 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.013038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.013053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.013063 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:42Z","lastTransitionTime":"2026-01-27T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.116128 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.116196 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.116214 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.116241 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.116259 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:42Z","lastTransitionTime":"2026-01-27T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.219168 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.219225 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.219245 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.219270 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.219288 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:42Z","lastTransitionTime":"2026-01-27T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.323141 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.323191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.323201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.323218 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.323231 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:42Z","lastTransitionTime":"2026-01-27T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.343047 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:42 crc kubenswrapper[4774]: E0127 00:08:42.343201 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:08:42 crc kubenswrapper[4774]: E0127 00:08:42.343253 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs podName:e639e1da-0d65-4d42-b1fc-23d5db91e9e6 nodeName:}" failed. No retries permitted until 2026-01-27 00:09:46.343239025 +0000 UTC m=+164.649015909 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs") pod "network-metrics-daemon-6djzf" (UID: "e639e1da-0d65-4d42-b1fc-23d5db91e9e6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.356480 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:42 crc kubenswrapper[4774]: E0127 00:08:42.356611 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.356688 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.356889 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:42 crc kubenswrapper[4774]: E0127 00:08:42.356952 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.357175 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:42 crc kubenswrapper[4774]: E0127 00:08:42.357513 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:42 crc kubenswrapper[4774]: E0127 00:08:42.357614 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.360601 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 03:51:18.728686287 +0000 UTC Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.402066 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=48.402038951 podStartE2EDuration="48.402038951s" podCreationTimestamp="2026-01-27 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.40202761 +0000 UTC m=+100.707804584" watchObservedRunningTime="2026-01-27 00:08:42.402038951 +0000 UTC m=+100.707815865" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.426512 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.426559 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.426571 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.426589 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.426601 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:42Z","lastTransitionTime":"2026-01-27T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.430261 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=74.430226652 podStartE2EDuration="1m14.430226652s" podCreationTimestamp="2026-01-27 00:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.429290667 +0000 UTC m=+100.735067561" watchObservedRunningTime="2026-01-27 00:08:42.430226652 +0000 UTC m=+100.736003576" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.518194 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podStartSLOduration=80.518162113 podStartE2EDuration="1m20.518162113s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.494024136 +0000 UTC m=+100.799801030" watchObservedRunningTime="2026-01-27 00:08:42.518162113 +0000 UTC m=+100.823939037" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.519284 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mtz9l" podStartSLOduration=80.519271091 podStartE2EDuration="1m20.519271091s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.517600408 +0000 UTC m=+100.823377382" watchObservedRunningTime="2026-01-27 00:08:42.519271091 +0000 UTC m=+100.825048015" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.529669 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.530015 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.530202 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.530355 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.530492 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:42Z","lastTransitionTime":"2026-01-27T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.544171 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=79.544134757 podStartE2EDuration="1m19.544134757s" podCreationTimestamp="2026-01-27 00:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.543951932 +0000 UTC m=+100.849728826" watchObservedRunningTime="2026-01-27 00:08:42.544134757 +0000 UTC m=+100.849911701" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.598177 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-g4cnl" podStartSLOduration=80.598147519 podStartE2EDuration="1m20.598147519s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.578809687 +0000 UTC m=+100.884586611" watchObservedRunningTime="2026-01-27 00:08:42.598147519 +0000 UTC m=+100.903924443" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.633680 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.634075 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.634215 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.634350 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.634485 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:42Z","lastTransitionTime":"2026-01-27T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.639163 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" podStartSLOduration=80.639143062 podStartE2EDuration="1m20.639143062s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.598679662 +0000 UTC m=+100.904456556" watchObservedRunningTime="2026-01-27 00:08:42.639143062 +0000 UTC m=+100.944920006" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.645527 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=80.645509428 podStartE2EDuration="1m20.645509428s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.645281841 +0000 UTC m=+100.951058755" watchObservedRunningTime="2026-01-27 00:08:42.645509428 +0000 UTC m=+100.951286372" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.660416 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8mtkj" podStartSLOduration=80.66039056299999 podStartE2EDuration="1m20.660390563s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.659900151 +0000 UTC m=+100.965677035" watchObservedRunningTime="2026-01-27 00:08:42.660390563 +0000 UTC m=+100.966167467" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.706933 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=11.70691606 podStartE2EDuration="11.70691606s" podCreationTimestamp="2026-01-27 00:08:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.685899475 +0000 UTC m=+100.991676379" watchObservedRunningTime="2026-01-27 00:08:42.70691606 +0000 UTC m=+101.012692944" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.737130 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.737169 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.737178 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.737191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.737200 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:42Z","lastTransitionTime":"2026-01-27T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.772920 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-57k5g" podStartSLOduration=80.772896753 podStartE2EDuration="1m20.772896753s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.747202686 +0000 UTC m=+101.052979590" watchObservedRunningTime="2026-01-27 00:08:42.772896753 +0000 UTC m=+101.078673647" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.839466 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.839514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.839523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.839540 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.839551 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:42Z","lastTransitionTime":"2026-01-27T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.959161 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.959212 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.959223 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.959238 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.959247 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:42Z","lastTransitionTime":"2026-01-27T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.061984 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.062022 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.062031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.062046 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.062058 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:43Z","lastTransitionTime":"2026-01-27T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.164656 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.164708 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.164738 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.164755 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.164766 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:43Z","lastTransitionTime":"2026-01-27T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.267441 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.267496 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.267509 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.267532 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.267547 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:43Z","lastTransitionTime":"2026-01-27T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.361223 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 03:34:09.736157344 +0000 UTC Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.370603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.370647 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.370675 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.370691 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.370702 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:43Z","lastTransitionTime":"2026-01-27T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.473662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.473732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.473752 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.473779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.473799 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:43Z","lastTransitionTime":"2026-01-27T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.576695 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.576770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.576797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.576851 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.576915 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:43Z","lastTransitionTime":"2026-01-27T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.680449 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.680513 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.680533 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.680562 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.680582 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:43Z","lastTransitionTime":"2026-01-27T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.783483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.783531 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.783546 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.783564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.783576 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:43Z","lastTransitionTime":"2026-01-27T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.888571 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.888634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.888654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.888680 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.888698 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:43Z","lastTransitionTime":"2026-01-27T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.992250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.992324 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.992335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.992354 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.992366 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:43Z","lastTransitionTime":"2026-01-27T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.095334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.095476 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.095499 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.095528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.095547 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:44Z","lastTransitionTime":"2026-01-27T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.198909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.198969 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.198994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.199013 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.199025 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:44Z","lastTransitionTime":"2026-01-27T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.303135 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.303195 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.303213 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.303239 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.303254 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:44Z","lastTransitionTime":"2026-01-27T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.355755 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.355803 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.355913 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.356147 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:44 crc kubenswrapper[4774]: E0127 00:08:44.356309 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:44 crc kubenswrapper[4774]: E0127 00:08:44.356507 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:44 crc kubenswrapper[4774]: E0127 00:08:44.356643 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:44 crc kubenswrapper[4774]: E0127 00:08:44.356823 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.362098 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 13:21:33.326757066 +0000 UTC Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.405366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.405477 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.405493 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.405544 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.405569 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:44Z","lastTransitionTime":"2026-01-27T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.460513 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.460594 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.460614 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.460645 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.460666 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:44Z","lastTransitionTime":"2026-01-27T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.535591 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5"] Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.536201 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.540509 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.541016 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.541055 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.541364 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.577408 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e07c9c1-5915-4d23-808f-ea9f32dd336b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.577524 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e07c9c1-5915-4d23-808f-ea9f32dd336b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.577569 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/8e07c9c1-5915-4d23-808f-ea9f32dd336b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.577607 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e07c9c1-5915-4d23-808f-ea9f32dd336b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.577707 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e07c9c1-5915-4d23-808f-ea9f32dd336b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.678843 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e07c9c1-5915-4d23-808f-ea9f32dd336b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.678995 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e07c9c1-5915-4d23-808f-ea9f32dd336b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.679034 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e07c9c1-5915-4d23-808f-ea9f32dd336b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.679078 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e07c9c1-5915-4d23-808f-ea9f32dd336b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.679152 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e07c9c1-5915-4d23-808f-ea9f32dd336b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.679266 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/8e07c9c1-5915-4d23-808f-ea9f32dd336b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.679305 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e07c9c1-5915-4d23-808f-ea9f32dd336b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.680985 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e07c9c1-5915-4d23-808f-ea9f32dd336b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.688664 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e07c9c1-5915-4d23-808f-ea9f32dd336b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.710300 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e07c9c1-5915-4d23-808f-ea9f32dd336b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.861931 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:45 crc kubenswrapper[4774]: I0127 00:08:45.015746 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" event={"ID":"8e07c9c1-5915-4d23-808f-ea9f32dd336b","Type":"ContainerStarted","Data":"b3bb617afafc5f78c5e03844200edf310dba21eb0905965334eb25f5650c05b2"} Jan 27 00:08:45 crc kubenswrapper[4774]: I0127 00:08:45.363080 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 01:54:29.890262752 +0000 UTC Jan 27 00:08:45 crc kubenswrapper[4774]: I0127 00:08:45.363162 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 27 00:08:45 crc kubenswrapper[4774]: I0127 00:08:45.373682 4774 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 27 00:08:46 crc kubenswrapper[4774]: I0127 00:08:46.022441 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" event={"ID":"8e07c9c1-5915-4d23-808f-ea9f32dd336b","Type":"ContainerStarted","Data":"e95407216eda41a7ac427282f895fd215892e27a7669a58b3031682c532b4032"} Jan 27 00:08:46 crc kubenswrapper[4774]: I0127 00:08:46.043433 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" podStartSLOduration=84.043406519 podStartE2EDuration="1m24.043406519s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:46.043176574 +0000 UTC m=+104.348953548" watchObservedRunningTime="2026-01-27 00:08:46.043406519 +0000 UTC m=+104.349183413" Jan 27 00:08:46 crc kubenswrapper[4774]: I0127 00:08:46.356562 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:46 crc kubenswrapper[4774]: I0127 00:08:46.356677 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:46 crc kubenswrapper[4774]: E0127 00:08:46.356754 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:46 crc kubenswrapper[4774]: I0127 00:08:46.356571 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:46 crc kubenswrapper[4774]: I0127 00:08:46.356568 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:46 crc kubenswrapper[4774]: E0127 00:08:46.357067 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:46 crc kubenswrapper[4774]: E0127 00:08:46.357124 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:46 crc kubenswrapper[4774]: E0127 00:08:46.357249 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:48 crc kubenswrapper[4774]: I0127 00:08:48.356513 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:48 crc kubenswrapper[4774]: E0127 00:08:48.356719 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:48 crc kubenswrapper[4774]: I0127 00:08:48.356790 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:48 crc kubenswrapper[4774]: I0127 00:08:48.357085 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:48 crc kubenswrapper[4774]: I0127 00:08:48.357305 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:48 crc kubenswrapper[4774]: E0127 00:08:48.357701 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:48 crc kubenswrapper[4774]: I0127 00:08:48.357783 4774 scope.go:117] "RemoveContainer" containerID="549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89" Jan 27 00:08:48 crc kubenswrapper[4774]: E0127 00:08:48.357970 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:48 crc kubenswrapper[4774]: E0127 00:08:48.357703 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:48 crc kubenswrapper[4774]: E0127 00:08:48.358006 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" Jan 27 00:08:50 crc kubenswrapper[4774]: I0127 00:08:50.356121 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:50 crc kubenswrapper[4774]: E0127 00:08:50.356289 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:50 crc kubenswrapper[4774]: I0127 00:08:50.356496 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:50 crc kubenswrapper[4774]: E0127 00:08:50.356537 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:50 crc kubenswrapper[4774]: I0127 00:08:50.356651 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:50 crc kubenswrapper[4774]: E0127 00:08:50.356696 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:50 crc kubenswrapper[4774]: I0127 00:08:50.356801 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:50 crc kubenswrapper[4774]: E0127 00:08:50.356842 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:52 crc kubenswrapper[4774]: I0127 00:08:52.355915 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:52 crc kubenswrapper[4774]: I0127 00:08:52.355939 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:52 crc kubenswrapper[4774]: I0127 00:08:52.355939 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:52 crc kubenswrapper[4774]: I0127 00:08:52.357705 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:52 crc kubenswrapper[4774]: E0127 00:08:52.357691 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:52 crc kubenswrapper[4774]: E0127 00:08:52.357847 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:52 crc kubenswrapper[4774]: E0127 00:08:52.357891 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:52 crc kubenswrapper[4774]: E0127 00:08:52.358066 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:54 crc kubenswrapper[4774]: I0127 00:08:54.356743 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:54 crc kubenswrapper[4774]: I0127 00:08:54.356740 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:54 crc kubenswrapper[4774]: I0127 00:08:54.356761 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:54 crc kubenswrapper[4774]: I0127 00:08:54.357056 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:54 crc kubenswrapper[4774]: E0127 00:08:54.357070 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:54 crc kubenswrapper[4774]: E0127 00:08:54.357290 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:54 crc kubenswrapper[4774]: E0127 00:08:54.357502 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:54 crc kubenswrapper[4774]: E0127 00:08:54.357595 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:56 crc kubenswrapper[4774]: I0127 00:08:56.355669 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:56 crc kubenswrapper[4774]: I0127 00:08:56.355724 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:56 crc kubenswrapper[4774]: I0127 00:08:56.355665 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:56 crc kubenswrapper[4774]: I0127 00:08:56.355770 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:56 crc kubenswrapper[4774]: E0127 00:08:56.355804 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:56 crc kubenswrapper[4774]: E0127 00:08:56.356037 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:56 crc kubenswrapper[4774]: E0127 00:08:56.356060 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:56 crc kubenswrapper[4774]: E0127 00:08:56.356107 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:58 crc kubenswrapper[4774]: I0127 00:08:58.065030 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtz9l_0abcf78e-9b05-4b89-94f3-4d3230886ce0/kube-multus/1.log" Jan 27 00:08:58 crc kubenswrapper[4774]: I0127 00:08:58.065401 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtz9l_0abcf78e-9b05-4b89-94f3-4d3230886ce0/kube-multus/0.log" Jan 27 00:08:58 crc kubenswrapper[4774]: I0127 00:08:58.065438 4774 generic.go:334] "Generic (PLEG): container finished" podID="0abcf78e-9b05-4b89-94f3-4d3230886ce0" containerID="01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972" exitCode=1 Jan 27 00:08:58 crc kubenswrapper[4774]: I0127 00:08:58.065475 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mtz9l" event={"ID":"0abcf78e-9b05-4b89-94f3-4d3230886ce0","Type":"ContainerDied","Data":"01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972"} Jan 27 00:08:58 crc kubenswrapper[4774]: I0127 00:08:58.065528 4774 scope.go:117] "RemoveContainer" containerID="62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b" Jan 27 00:08:58 crc kubenswrapper[4774]: I0127 00:08:58.066195 4774 scope.go:117] "RemoveContainer" containerID="01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972" Jan 27 00:08:58 crc kubenswrapper[4774]: E0127 00:08:58.066474 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-mtz9l_openshift-multus(0abcf78e-9b05-4b89-94f3-4d3230886ce0)\"" pod="openshift-multus/multus-mtz9l" podUID="0abcf78e-9b05-4b89-94f3-4d3230886ce0" Jan 27 00:08:58 crc kubenswrapper[4774]: I0127 00:08:58.355621 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:58 crc kubenswrapper[4774]: I0127 00:08:58.355779 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:58 crc kubenswrapper[4774]: E0127 00:08:58.356031 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:58 crc kubenswrapper[4774]: I0127 00:08:58.356070 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:58 crc kubenswrapper[4774]: E0127 00:08:58.356133 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:58 crc kubenswrapper[4774]: I0127 00:08:58.356071 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:58 crc kubenswrapper[4774]: E0127 00:08:58.356226 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:58 crc kubenswrapper[4774]: E0127 00:08:58.356377 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:59 crc kubenswrapper[4774]: I0127 00:08:59.072144 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtz9l_0abcf78e-9b05-4b89-94f3-4d3230886ce0/kube-multus/1.log" Jan 27 00:09:00 crc kubenswrapper[4774]: I0127 00:09:00.356376 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:00 crc kubenswrapper[4774]: E0127 00:09:00.357083 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:09:00 crc kubenswrapper[4774]: I0127 00:09:00.356405 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:00 crc kubenswrapper[4774]: E0127 00:09:00.357187 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:09:00 crc kubenswrapper[4774]: I0127 00:09:00.356384 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:00 crc kubenswrapper[4774]: I0127 00:09:00.356398 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:00 crc kubenswrapper[4774]: E0127 00:09:00.357271 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:09:00 crc kubenswrapper[4774]: I0127 00:09:00.357326 4774 scope.go:117] "RemoveContainer" containerID="549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89" Jan 27 00:09:00 crc kubenswrapper[4774]: E0127 00:09:00.357385 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:09:01 crc kubenswrapper[4774]: I0127 00:09:01.084380 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/3.log" Jan 27 00:09:01 crc kubenswrapper[4774]: I0127 00:09:01.087652 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c"} Jan 27 00:09:01 crc kubenswrapper[4774]: I0127 00:09:01.088110 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:09:01 crc kubenswrapper[4774]: I0127 00:09:01.114907 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podStartSLOduration=99.114830289 podStartE2EDuration="1m39.114830289s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:01.114313455 +0000 UTC m=+119.420090369" watchObservedRunningTime="2026-01-27 00:09:01.114830289 +0000 UTC m=+119.420607203" Jan 27 00:09:01 crc kubenswrapper[4774]: I0127 00:09:01.212454 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6djzf"] Jan 27 00:09:01 crc kubenswrapper[4774]: I0127 00:09:01.212631 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:01 crc kubenswrapper[4774]: E0127 00:09:01.212780 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:09:02 crc kubenswrapper[4774]: I0127 00:09:02.355884 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:02 crc kubenswrapper[4774]: I0127 00:09:02.355909 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:02 crc kubenswrapper[4774]: I0127 00:09:02.358408 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:02 crc kubenswrapper[4774]: E0127 00:09:02.358440 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:09:02 crc kubenswrapper[4774]: E0127 00:09:02.358657 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:09:02 crc kubenswrapper[4774]: E0127 00:09:02.358685 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:09:02 crc kubenswrapper[4774]: E0127 00:09:02.370607 4774 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 27 00:09:02 crc kubenswrapper[4774]: E0127 00:09:02.457320 4774 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 27 00:09:03 crc kubenswrapper[4774]: I0127 00:09:03.355818 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:03 crc kubenswrapper[4774]: E0127 00:09:03.356326 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:09:04 crc kubenswrapper[4774]: I0127 00:09:04.356392 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:04 crc kubenswrapper[4774]: I0127 00:09:04.356475 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:04 crc kubenswrapper[4774]: I0127 00:09:04.356397 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:04 crc kubenswrapper[4774]: E0127 00:09:04.356626 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:09:04 crc kubenswrapper[4774]: E0127 00:09:04.356729 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:09:04 crc kubenswrapper[4774]: E0127 00:09:04.357033 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:09:05 crc kubenswrapper[4774]: I0127 00:09:05.356484 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:05 crc kubenswrapper[4774]: E0127 00:09:05.356763 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:09:06 crc kubenswrapper[4774]: I0127 00:09:06.356447 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:06 crc kubenswrapper[4774]: I0127 00:09:06.356579 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:06 crc kubenswrapper[4774]: I0127 00:09:06.356702 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:06 crc kubenswrapper[4774]: E0127 00:09:06.356700 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:09:06 crc kubenswrapper[4774]: E0127 00:09:06.356946 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:09:06 crc kubenswrapper[4774]: E0127 00:09:06.357070 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:09:07 crc kubenswrapper[4774]: I0127 00:09:07.355598 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:07 crc kubenswrapper[4774]: E0127 00:09:07.355938 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:09:07 crc kubenswrapper[4774]: E0127 00:09:07.458753 4774 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 27 00:09:08 crc kubenswrapper[4774]: I0127 00:09:08.356122 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:08 crc kubenswrapper[4774]: I0127 00:09:08.356173 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:08 crc kubenswrapper[4774]: I0127 00:09:08.356245 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:08 crc kubenswrapper[4774]: E0127 00:09:08.356430 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:09:08 crc kubenswrapper[4774]: E0127 00:09:08.356614 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:09:08 crc kubenswrapper[4774]: E0127 00:09:08.356820 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:09:09 crc kubenswrapper[4774]: I0127 00:09:09.355814 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:09 crc kubenswrapper[4774]: E0127 00:09:09.356045 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:09:10 crc kubenswrapper[4774]: I0127 00:09:10.356292 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:10 crc kubenswrapper[4774]: I0127 00:09:10.356308 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:10 crc kubenswrapper[4774]: E0127 00:09:10.356598 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:09:10 crc kubenswrapper[4774]: I0127 00:09:10.356935 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:10 crc kubenswrapper[4774]: E0127 00:09:10.357037 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:09:10 crc kubenswrapper[4774]: E0127 00:09:10.357390 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:09:10 crc kubenswrapper[4774]: I0127 00:09:10.357688 4774 scope.go:117] "RemoveContainer" containerID="01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972" Jan 27 00:09:11 crc kubenswrapper[4774]: I0127 00:09:11.127811 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtz9l_0abcf78e-9b05-4b89-94f3-4d3230886ce0/kube-multus/1.log" Jan 27 00:09:11 crc kubenswrapper[4774]: I0127 00:09:11.128255 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mtz9l" event={"ID":"0abcf78e-9b05-4b89-94f3-4d3230886ce0","Type":"ContainerStarted","Data":"b80af4d88f8c0edcc1099c3d9d22e61df1448e662e454061bfd5aab1317d804c"} Jan 27 00:09:11 crc kubenswrapper[4774]: I0127 00:09:11.356126 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:11 crc kubenswrapper[4774]: E0127 00:09:11.356406 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:09:12 crc kubenswrapper[4774]: I0127 00:09:12.356142 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:12 crc kubenswrapper[4774]: I0127 00:09:12.356132 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:12 crc kubenswrapper[4774]: I0127 00:09:12.357611 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:12 crc kubenswrapper[4774]: E0127 00:09:12.357756 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:09:12 crc kubenswrapper[4774]: E0127 00:09:12.357842 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:09:12 crc kubenswrapper[4774]: E0127 00:09:12.357962 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:09:12 crc kubenswrapper[4774]: E0127 00:09:12.460217 4774 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 27 00:09:13 crc kubenswrapper[4774]: I0127 00:09:13.355966 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:13 crc kubenswrapper[4774]: E0127 00:09:13.356133 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:09:14 crc kubenswrapper[4774]: I0127 00:09:14.356516 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:14 crc kubenswrapper[4774]: I0127 00:09:14.356594 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:14 crc kubenswrapper[4774]: I0127 00:09:14.356601 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:14 crc kubenswrapper[4774]: E0127 00:09:14.356743 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:09:14 crc kubenswrapper[4774]: E0127 00:09:14.356837 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:09:14 crc kubenswrapper[4774]: E0127 00:09:14.356894 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:09:15 crc kubenswrapper[4774]: I0127 00:09:15.355558 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:15 crc kubenswrapper[4774]: E0127 00:09:15.355765 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:09:16 crc kubenswrapper[4774]: I0127 00:09:16.356464 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:16 crc kubenswrapper[4774]: I0127 00:09:16.356551 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:16 crc kubenswrapper[4774]: I0127 00:09:16.356461 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:16 crc kubenswrapper[4774]: E0127 00:09:16.356745 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:09:16 crc kubenswrapper[4774]: E0127 00:09:16.356649 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:09:16 crc kubenswrapper[4774]: E0127 00:09:16.357012 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:09:17 crc kubenswrapper[4774]: I0127 00:09:17.356098 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:17 crc kubenswrapper[4774]: E0127 00:09:17.356279 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:09:18 crc kubenswrapper[4774]: I0127 00:09:18.356485 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:18 crc kubenswrapper[4774]: I0127 00:09:18.356563 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:18 crc kubenswrapper[4774]: I0127 00:09:18.357507 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:18 crc kubenswrapper[4774]: I0127 00:09:18.360439 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 27 00:09:18 crc kubenswrapper[4774]: I0127 00:09:18.360583 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 27 00:09:18 crc kubenswrapper[4774]: I0127 00:09:18.360742 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 27 00:09:18 crc kubenswrapper[4774]: I0127 00:09:18.360761 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 27 00:09:19 crc kubenswrapper[4774]: I0127 00:09:19.357146 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:19 crc kubenswrapper[4774]: I0127 00:09:19.360281 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 27 00:09:19 crc kubenswrapper[4774]: I0127 00:09:19.360580 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.330297 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.388819 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gstl6"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.389284 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.393572 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-j4bbk"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.394085 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kv45n"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.394393 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.394739 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.395256 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.395449 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.396157 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.396487 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.399855 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.401041 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.404408 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.404932 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.405460 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.413244 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.413519 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.414005 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.414367 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.414421 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.414501 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.415033 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.415178 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.415515 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 27 
00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.415620 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.415669 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5b8h8"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.415950 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.415415 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.415191 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.415396 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.416722 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.416549 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.417115 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.415462 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.418594 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.419154 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.419651 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.419892 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.419927 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.420164 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.420595 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.420986 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.422187 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 27 
00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.422355 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.422536 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.425932 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.426760 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.428577 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.430407 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.435843 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.439038 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.444461 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.444818 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.445334 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.445604 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.445849 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.446044 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.446246 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.446500 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.446813 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.447140 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.450055 4774 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.450203 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.450384 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.450581 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.450776 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.450933 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w8qm5"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.451470 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.451998 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r9m95"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.452634 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.452908 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.468355 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.469974 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.470632 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.471496 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.483134 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.491094 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.491565 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.491925 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.492088 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.492875 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29491200-4blsg"] Jan 27 00:09:25 crc 
kubenswrapper[4774]: I0127 00:09:25.493121 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.493448 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494161 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-serving-cert\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494211 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-encryption-config\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494228 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l72t8\" (UniqueName: \"kubernetes.io/projected/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-kube-api-access-l72t8\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494246 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/98696d49-1f48-4d0d-9e26-691558f704c5-node-pullsecrets\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494276 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-etcd-serving-ca\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494290 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-image-import-ca\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494310 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-audit-policies\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494334 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nrbdc\" (UniqueName: \"kubernetes.io/projected/98696d49-1f48-4d0d-9e26-691558f704c5-kube-api-access-nrbdc\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494351 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494367 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/98696d49-1f48-4d0d-9e26-691558f704c5-etcd-client\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494386 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1df48642-0cb9-4242-ae92-16b239115170-config\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494402 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-audit\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494419 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98696d49-1f48-4d0d-9e26-691558f704c5-serving-cert\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494442 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj8bl\" (UniqueName: \"kubernetes.io/projected/1df48642-0cb9-4242-ae92-16b239115170-kube-api-access-hj8bl\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494473 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7cb54417-3e44-4c36-a1cf-438a705f8dcf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494495 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-config\") pod 
\"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494516 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-serving-cert\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494537 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btcqh\" (UniqueName: \"kubernetes.io/projected/7cab68e6-3113-4536-b5c3-2293265c9502-kube-api-access-btcqh\") pod \"dns-operator-744455d44c-w8qm5\" (UID: \"7cab68e6-3113-4536-b5c3-2293265c9502\") " pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494555 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz5lb\" (UniqueName: \"kubernetes.io/projected/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-kube-api-access-tz5lb\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494572 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/956bdf0f-108e-4966-bbb6-77be23255cec-serving-cert\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494591 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1df48642-0cb9-4242-ae92-16b239115170-machine-approver-tls\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494606 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-audit-dir\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494627 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494645 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lxgx\" (UniqueName: \"kubernetes.io/projected/9166b710-4bb0-4fc0-8e54-45907543c22f-kube-api-access-4lxgx\") pod 
\"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494651 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29491200-4blsg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494670 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npcps\" (UniqueName: \"kubernetes.io/projected/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-kube-api-access-npcps\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494692 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1df48642-0cb9-4242-ae92-16b239115170-auth-proxy-config\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494710 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7cb54417-3e44-4c36-a1cf-438a705f8dcf-images\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494728 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-etcd-client\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494747 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494769 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/db8adc8a-118c-485d-b752-c3d0a2888458-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5rsb9\" (UID: \"db8adc8a-118c-485d-b752-c3d0a2888458\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494797 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494813 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/956bdf0f-108e-4966-bbb6-77be23255cec-config\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494830 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/956bdf0f-108e-4966-bbb6-77be23255cec-trusted-ca\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494849 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59zpq\" (UniqueName: \"kubernetes.io/projected/7cb54417-3e44-4c36-a1cf-438a705f8dcf-kube-api-access-59zpq\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494886 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9166b710-4bb0-4fc0-8e54-45907543c22f-serving-cert\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494900 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cb54417-3e44-4c36-a1cf-438a705f8dcf-config\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494915 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-service-ca-bundle\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494933 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494947 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-config\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494964 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-config\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494982 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-config\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494997 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-client-ca\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.495014 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/98696d49-1f48-4d0d-9e26-691558f704c5-encryption-config\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.495030 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cab68e6-3113-4536-b5c3-2293265c9502-metrics-tls\") pod \"dns-operator-744455d44c-w8qm5\" (UID: \"7cab68e6-3113-4536-b5c3-2293265c9502\") " pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.495046 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/98696d49-1f48-4d0d-9e26-691558f704c5-audit-dir\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.495062 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-client-ca\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.495094 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-serving-cert\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.495111 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.495126 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksgtc\" (UniqueName: \"kubernetes.io/projected/db8adc8a-118c-485d-b752-c3d0a2888458-kube-api-access-ksgtc\") pod \"cluster-samples-operator-665b6dd947-5rsb9\" (UID: \"db8adc8a-118c-485d-b752-c3d0a2888458\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.495140 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw55r\" (UniqueName: \"kubernetes.io/projected/956bdf0f-108e-4966-bbb6-77be23255cec-kube-api-access-dw55r\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.496196 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.493438 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.496978 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.497030 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.497139 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.497220 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.497287 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.501829 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.502244 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.502435 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.502637 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.502835 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.503086 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.503195 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.503489 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.503894 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.507033 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-nt6rp"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.507682 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-nt6rp" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.513735 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.516497 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.516764 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.516902 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.517027 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.517164 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.517970 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.518093 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.518192 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.518290 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.518381 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-brc4j"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.522608 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.523149 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.528944 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-776sg"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.529287 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.529450 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gstl6"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.529473 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.529768 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.529990 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.529991 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.530887 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.534916 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hm9kj"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.536789 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.537184 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.537476 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.537763 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.537783 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.539361 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.539485 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.539609 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.539648 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.539714 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.539780 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.540001 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.540705 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.541998 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zgbtz"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.542628 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.543077 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.544602 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.545151 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.545316 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.545747 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.545877 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.548202 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-44dpr"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.558335 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.560451 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.577821 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.578007 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.578101 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.582576 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.584435 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.586580 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.589537 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.590138 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.591415 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.591471 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.591630 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.591632 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.595937 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kv45n"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596035 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596417 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cab68e6-3113-4536-b5c3-2293265c9502-metrics-tls\") pod \"dns-operator-744455d44c-w8qm5\" (UID: \"7cab68e6-3113-4536-b5c3-2293265c9502\") " pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596455 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596461 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596760 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-serving-cert\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596809 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/459b67d7-cbf3-4890-a9a2-3e143ee459aa-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jc5dv\" (UID: \"459b67d7-cbf3-4890-a9a2-3e143ee459aa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596844 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35026702-de7c-4f5f-8714-f1d7f89adae6-trusted-ca\") pod \"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596898 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l72t8\" (UniqueName: \"kubernetes.io/projected/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-kube-api-access-l72t8\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: 
\"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596932 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e6f70216-3d55-4dd8-81f7-f2129a277407-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vl2l2\" (UID: \"e6f70216-3d55-4dd8-81f7-f2129a277407\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596991 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-policies\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596903 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8z5sz"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.597291 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.597838 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.599975 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.601022 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/98696d49-1f48-4d0d-9e26-691558f704c5-node-pullsecrets\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.601133 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/98696d49-1f48-4d0d-9e26-691558f704c5-node-pullsecrets\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.601193 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602006 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckh7l\" (UniqueName: \"kubernetes.io/projected/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-kube-api-access-ckh7l\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 
00:09:25.602090 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrbdc\" (UniqueName: \"kubernetes.io/projected/98696d49-1f48-4d0d-9e26-691558f704c5-kube-api-access-nrbdc\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602128 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kbkr\" (UniqueName: \"kubernetes.io/projected/518a161e-aeab-4ad6-a2c0-dee7ec963958-kube-api-access-8kbkr\") pod \"image-pruner-29491200-4blsg\" (UID: \"518a161e-aeab-4ad6-a2c0-dee7ec963958\") " pod="openshift-image-registry/image-pruner-29491200-4blsg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602156 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05b20433-569d-4f1d-acd8-127119a934e1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h7h7w\" (UID: \"05b20433-569d-4f1d-acd8-127119a934e1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602177 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftjqz\" (UniqueName: \"kubernetes.io/projected/8c9ce6bb-224d-498c-9299-762d84b0eaa3-kube-api-access-ftjqz\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602199 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/193a08ea-86ea-4176-898e-6a2c476ea6e9-console-serving-cert\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602219 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-service-ca\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602240 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-audit\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602264 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98696d49-1f48-4d0d-9e26-691558f704c5-serving-cert\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602286 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-serving-cert\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602308 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btcqh\" (UniqueName: \"kubernetes.io/projected/7cab68e6-3113-4536-b5c3-2293265c9502-kube-api-access-btcqh\") pod \"dns-operator-744455d44c-w8qm5\" (UID: \"7cab68e6-3113-4536-b5c3-2293265c9502\") " pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602329 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz5lb\" (UniqueName: \"kubernetes.io/projected/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-kube-api-access-tz5lb\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602350 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/956bdf0f-108e-4966-bbb6-77be23255cec-serving-cert\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602372 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/607f973b-fc78-4c11-bc09-fdbbe414d8c7-metrics-certs\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602393 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602418 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602442 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5be86044-916e-44ba-9855-360b0ae2471a-proxy-tls\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602465 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9lnl\" (UniqueName: \"kubernetes.io/projected/35026702-de7c-4f5f-8714-f1d7f89adae6-kube-api-access-b9lnl\") pod 
\"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602483 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-etcd-client\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602502 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ea5ac78-6111-4f12-a2db-3a6f4e400d39-config\") pod \"kube-controller-manager-operator-78b949d7b-rnh4z\" (UID: \"9ea5ac78-6111-4f12-a2db-3a6f4e400d39\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602520 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh9kp\" (UniqueName: \"kubernetes.io/projected/f8bb8619-eac4-481e-bdb8-5fb5985c1844-kube-api-access-mh9kp\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602539 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lxgx\" (UniqueName: \"kubernetes.io/projected/9166b710-4bb0-4fc0-8e54-45907543c22f-kube-api-access-4lxgx\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602561 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e83c76a-d44d-4a1b-b904-857dad56b5ba-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ddxk7\" (UID: \"7e83c76a-d44d-4a1b-b904-857dad56b5ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602591 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602616 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcd12b52-828d-4244-99e9-2291f3a0bbb6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-772k8\" (UID: \"dcd12b52-828d-4244-99e9-2291f3a0bbb6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602640 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/1df48642-0cb9-4242-ae92-16b239115170-auth-proxy-config\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602661 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7cb54417-3e44-4c36-a1cf-438a705f8dcf-images\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602683 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-dir\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602701 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602718 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-serving-cert\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602741 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602765 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/607f973b-fc78-4c11-bc09-fdbbe414d8c7-stats-auth\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602784 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602803 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-trusted-ca-bundle\") pod \"console-f9d7485db-776sg\" (UID: 
\"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602822 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/db8adc8a-118c-485d-b752-c3d0a2888458-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5rsb9\" (UID: \"db8adc8a-118c-485d-b752-c3d0a2888458\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602841 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602875 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/956bdf0f-108e-4966-bbb6-77be23255cec-trusted-ca\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602896 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602914 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwq54\" (UniqueName: \"kubernetes.io/projected/193a08ea-86ea-4176-898e-6a2c476ea6e9-kube-api-access-nwq54\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602935 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9166b710-4bb0-4fc0-8e54-45907543c22f-serving-cert\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602952 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-service-ca-bundle\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602971 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-config\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 
00:09:25.602990 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-etcd-ca\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603011 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05b20433-569d-4f1d-acd8-127119a934e1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-h7h7w\" (UID: \"05b20433-569d-4f1d-acd8-127119a934e1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603030 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9vrd\" (UniqueName: \"kubernetes.io/projected/607f973b-fc78-4c11-bc09-fdbbe414d8c7-kube-api-access-c9vrd\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603048 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzc9x\" (UniqueName: \"kubernetes.io/projected/5be86044-916e-44ba-9855-360b0ae2471a-kube-api-access-fzc9x\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603068 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-config\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603088 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/607f973b-fc78-4c11-bc09-fdbbe414d8c7-service-ca-bundle\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603109 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-client-ca\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603129 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q79th\" (UniqueName: \"kubernetes.io/projected/ec68f29b-2b30-4dff-b875-7617466be51b-kube-api-access-q79th\") pod \"control-plane-machine-set-operator-78cbb6b69f-tjg8b\" (UID: \"ec68f29b-2b30-4dff-b875-7617466be51b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603148 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c9ce6bb-224d-498c-9299-762d84b0eaa3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603167 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/98696d49-1f48-4d0d-9e26-691558f704c5-encryption-config\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603183 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5be86044-916e-44ba-9855-360b0ae2471a-images\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603201 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/98696d49-1f48-4d0d-9e26-691558f704c5-audit-dir\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603218 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-client-ca\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603245 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-serving-cert\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603261 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603277 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksgtc\" (UniqueName: \"kubernetes.io/projected/db8adc8a-118c-485d-b752-c3d0a2888458-kube-api-access-ksgtc\") pod \"cluster-samples-operator-665b6dd947-5rsb9\" (UID: \"db8adc8a-118c-485d-b752-c3d0a2888458\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603295 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw55r\" (UniqueName: 
\"kubernetes.io/projected/956bdf0f-108e-4966-bbb6-77be23255cec-kube-api-access-dw55r\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603313 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e83c76a-d44d-4a1b-b904-857dad56b5ba-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ddxk7\" (UID: \"7e83c76a-d44d-4a1b-b904-857dad56b5ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603333 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-encryption-config\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603350 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5be86044-916e-44ba-9855-360b0ae2471a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603367 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd12b52-828d-4244-99e9-2291f3a0bbb6-config\") pod \"kube-apiserver-operator-766d6c64bb-772k8\" (UID: \"dcd12b52-828d-4244-99e9-2291f3a0bbb6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603397 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-etcd-serving-ca\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603417 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-image-import-ca\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603437 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-audit-policies\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603460 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603480 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/98696d49-1f48-4d0d-9e26-691558f704c5-etcd-client\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603498 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1df48642-0cb9-4242-ae92-16b239115170-config\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603518 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-244zr\" (UniqueName: \"kubernetes.io/projected/05b20433-569d-4f1d-acd8-127119a934e1-kube-api-access-244zr\") pod \"openshift-controller-manager-operator-756b6f6bc6-h7h7w\" (UID: \"05b20433-569d-4f1d-acd8-127119a934e1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603539 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc8l7\" (UniqueName: \"kubernetes.io/projected/475a4aef-33e7-40ad-b7c4-20a10efa4ec3-kube-api-access-cc8l7\") pod \"downloads-7954f5f757-nt6rp\" (UID: \"475a4aef-33e7-40ad-b7c4-20a10efa4ec3\") " pod="openshift-console/downloads-7954f5f757-nt6rp" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603559 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj8bl\" (UniqueName: \"kubernetes.io/projected/1df48642-0cb9-4242-ae92-16b239115170-kube-api-access-hj8bl\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603575 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7cb54417-3e44-4c36-a1cf-438a705f8dcf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603610 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-config\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603628 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/459b67d7-cbf3-4890-a9a2-3e143ee459aa-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jc5dv\" (UID: \"459b67d7-cbf3-4890-a9a2-3e143ee459aa\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603647 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1df48642-0cb9-4242-ae92-16b239115170-machine-approver-tls\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603667 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-audit-dir\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603683 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/193a08ea-86ea-4176-898e-6a2c476ea6e9-console-oauth-config\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603701 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec68f29b-2b30-4dff-b875-7617466be51b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tjg8b\" (UID: \"ec68f29b-2b30-4dff-b875-7617466be51b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603719 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-oauth-serving-cert\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603761 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/607f973b-fc78-4c11-bc09-fdbbe414d8c7-default-certificate\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603782 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npcps\" (UniqueName: \"kubernetes.io/projected/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-kube-api-access-npcps\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603800 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhl8d\" (UniqueName: \"kubernetes.io/projected/7e83c76a-d44d-4a1b-b904-857dad56b5ba-kube-api-access-bhl8d\") pod \"openshift-apiserver-operator-796bbdcf4f-ddxk7\" (UID: \"7e83c76a-d44d-4a1b-b904-857dad56b5ba\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603820 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603839 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c9ce6bb-224d-498c-9299-762d84b0eaa3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.604791 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98696d49-1f48-4d0d-9e26-691558f704c5-serving-cert\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.605422 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-audit\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602375 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.606915 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.607262 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.607440 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-serving-cert\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.607982 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1df48642-0cb9-4242-ae92-16b239115170-auth-proxy-config\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.608133 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-etcd-serving-ca\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.608819 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-image-import-ca\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.608992 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-client-ca\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.609100 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7cb54417-3e44-4c36-a1cf-438a705f8dcf-images\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.609394 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-audit-policies\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.610101 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.611092 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/98696d49-1f48-4d0d-9e26-691558f704c5-encryption-config\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.611140 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/98696d49-1f48-4d0d-9e26-691558f704c5-audit-dir\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.611371 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-serving-cert\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.611793 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.612657 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxtrq"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.613215 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-etcd-client\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.614284 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-client-ca\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.615131 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-config\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.615208 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-audit-dir\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.615606 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1df48642-0cb9-4242-ae92-16b239115170-config\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.615830 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-config\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616114 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-etcd-client\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616168 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616202 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/459b67d7-cbf3-4890-a9a2-3e143ee459aa-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jc5dv\" (UID: \"459b67d7-cbf3-4890-a9a2-3e143ee459aa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616226 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-console-config\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616236 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616257 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4648\" (UniqueName: \"kubernetes.io/projected/e6f70216-3d55-4dd8-81f7-f2129a277407-kube-api-access-q4648\") pod \"openshift-config-operator-7777fb866f-vl2l2\" (UID: \"e6f70216-3d55-4dd8-81f7-f2129a277407\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616281 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616362 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/956bdf0f-108e-4966-bbb6-77be23255cec-config\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616385 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616411 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ea5ac78-6111-4f12-a2db-3a6f4e400d39-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rnh4z\" (UID: \"9ea5ac78-6111-4f12-a2db-3a6f4e400d39\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616434 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59zpq\" (UniqueName: \"kubernetes.io/projected/7cb54417-3e44-4c36-a1cf-438a705f8dcf-kube-api-access-59zpq\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616455 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cb54417-3e44-4c36-a1cf-438a705f8dcf-config\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616477 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f70216-3d55-4dd8-81f7-f2129a277407-serving-cert\") pod \"openshift-config-operator-7777fb866f-vl2l2\" (UID: \"e6f70216-3d55-4dd8-81f7-f2129a277407\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616497 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcd12b52-828d-4244-99e9-2291f3a0bbb6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-772k8\" (UID: \"dcd12b52-828d-4244-99e9-2291f3a0bbb6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616518 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c9ce6bb-224d-498c-9299-762d84b0eaa3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616553 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/518a161e-aeab-4ad6-a2c0-dee7ec963958-serviceca\") pod \"image-pruner-29491200-4blsg\" (UID: \"518a161e-aeab-4ad6-a2c0-dee7ec963958\") " pod="openshift-image-registry/image-pruner-29491200-4blsg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616579 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ea5ac78-6111-4f12-a2db-3a6f4e400d39-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rnh4z\" (UID: \"9ea5ac78-6111-4f12-a2db-3a6f4e400d39\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616606 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-etcd-service-ca\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.617294 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/956bdf0f-108e-4966-bbb6-77be23255cec-config\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.617485 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.618042 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cb54417-3e44-4c36-a1cf-438a705f8dcf-config\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.618083 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-serving-cert\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.618108 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-config\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: 
\"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.618137 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/35026702-de7c-4f5f-8714-f1d7f89adae6-metrics-tls\") pod \"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.618161 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-config\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.618182 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35026702-de7c-4f5f-8714-f1d7f89adae6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.619446 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-config\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.619656 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cab68e6-3113-4536-b5c3-2293265c9502-metrics-tls\") pod \"dns-operator-744455d44c-w8qm5\" (UID: \"7cab68e6-3113-4536-b5c3-2293265c9502\") " pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.620911 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.621667 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.621675 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-service-ca-bundle\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.622173 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-config\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.622260 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.622384 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9166b710-4bb0-4fc0-8e54-45907543c22f-serving-cert\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.622568 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.623291 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/956bdf0f-108e-4966-bbb6-77be23255cec-trusted-ca\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.623315 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.623414 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/956bdf0f-108e-4966-bbb6-77be23255cec-serving-cert\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.623876 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7cb54417-3e44-4c36-a1cf-438a705f8dcf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.624925 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.625338 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.625723 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.625745 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1df48642-0cb9-4242-ae92-16b239115170-machine-approver-tls\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.625936 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.626055 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-encryption-config\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.626082 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.626842 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.627233 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/98696d49-1f48-4d0d-9e26-691558f704c5-etcd-client\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.627470 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.628225 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.630227 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.630438 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/db8adc8a-118c-485d-b752-c3d0a2888458-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5rsb9\" (UID: \"db8adc8a-118c-485d-b752-c3d0a2888458\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.630713 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.631743 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8kqjp"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.632652 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.633562 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.643584 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.649251 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5b8h8"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.651948 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-j4bbk"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.652065 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.653069 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nt6rp"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.654352 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r9m95"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.655583 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29491200-4blsg"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.658088 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w8qm5"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.659190 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ttgpc"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.659920 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ttgpc" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.660255 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.661361 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.662426 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hm9kj"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.663635 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.664669 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.665925 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.666974 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.668159 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.669151 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.672529 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.674410 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.676240 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.679977 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.682015 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-776sg"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.683693 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.684822 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-brc4j"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.685896 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxtrq"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.686888 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.687928 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.689195 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8z5sz"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.690255 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.691234 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8kqjp"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.692262 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.693622 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-m59xl"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.694984 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-th968"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.695147 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.696394 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.696441 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.696581 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-th968" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.697966 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.698530 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zgbtz"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.699554 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ttgpc"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.700551 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.701747 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.702952 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-m59xl"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.703949 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-th968"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.705081 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-pjtrg"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.705841 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.711609 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719386 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckh7l\" (UniqueName: \"kubernetes.io/projected/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-kube-api-access-ckh7l\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719425 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719467 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kbkr\" (UniqueName: \"kubernetes.io/projected/518a161e-aeab-4ad6-a2c0-dee7ec963958-kube-api-access-8kbkr\") pod \"image-pruner-29491200-4blsg\" (UID: \"518a161e-aeab-4ad6-a2c0-dee7ec963958\") " pod="openshift-image-registry/image-pruner-29491200-4blsg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719492 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftjqz\" (UniqueName: \"kubernetes.io/projected/8c9ce6bb-224d-498c-9299-762d84b0eaa3-kube-api-access-ftjqz\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719514 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/193a08ea-86ea-4176-898e-6a2c476ea6e9-console-serving-cert\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719534 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05b20433-569d-4f1d-acd8-127119a934e1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h7h7w\" (UID: \"05b20433-569d-4f1d-acd8-127119a934e1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719556 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-service-ca\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719594 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/607f973b-fc78-4c11-bc09-fdbbe414d8c7-metrics-certs\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719616 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719638 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5be86044-916e-44ba-9855-360b0ae2471a-proxy-tls\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719659 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9lnl\" (UniqueName: \"kubernetes.io/projected/35026702-de7c-4f5f-8714-f1d7f89adae6-kube-api-access-b9lnl\") pod \"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719679 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-etcd-client\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719699 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9ea5ac78-6111-4f12-a2db-3a6f4e400d39-config\") pod \"kube-controller-manager-operator-78b949d7b-rnh4z\" (UID: \"9ea5ac78-6111-4f12-a2db-3a6f4e400d39\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719718 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh9kp\" (UniqueName: \"kubernetes.io/projected/f8bb8619-eac4-481e-bdb8-5fb5985c1844-kube-api-access-mh9kp\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719740 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719775 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e83c76a-d44d-4a1b-b904-857dad56b5ba-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ddxk7\" (UID: \"7e83c76a-d44d-4a1b-b904-857dad56b5ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719799 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcd12b52-828d-4244-99e9-2291f3a0bbb6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-772k8\" (UID: \"dcd12b52-828d-4244-99e9-2291f3a0bbb6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719818 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-dir\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719836 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719888 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-serving-cert\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719910 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/607f973b-fc78-4c11-bc09-fdbbe414d8c7-stats-auth\") pod 
\"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719933 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719955 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-trusted-ca-bundle\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719978 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719998 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwq54\" (UniqueName: \"kubernetes.io/projected/193a08ea-86ea-4176-898e-6a2c476ea6e9-kube-api-access-nwq54\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720021 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-config\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720038 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-etcd-ca\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720058 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05b20433-569d-4f1d-acd8-127119a934e1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-h7h7w\" (UID: \"05b20433-569d-4f1d-acd8-127119a934e1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720077 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9vrd\" (UniqueName: \"kubernetes.io/projected/607f973b-fc78-4c11-bc09-fdbbe414d8c7-kube-api-access-c9vrd\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720098 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzc9x\" (UniqueName: \"kubernetes.io/projected/5be86044-916e-44ba-9855-360b0ae2471a-kube-api-access-fzc9x\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720118 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/607f973b-fc78-4c11-bc09-fdbbe414d8c7-service-ca-bundle\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720141 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c9ce6bb-224d-498c-9299-762d84b0eaa3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720163 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q79th\" (UniqueName: \"kubernetes.io/projected/ec68f29b-2b30-4dff-b875-7617466be51b-kube-api-access-q79th\") pod \"control-plane-machine-set-operator-78cbb6b69f-tjg8b\" (UID: \"ec68f29b-2b30-4dff-b875-7617466be51b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720182 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5be86044-916e-44ba-9855-360b0ae2471a-images\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720226 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e83c76a-d44d-4a1b-b904-857dad56b5ba-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ddxk7\" (UID: \"7e83c76a-d44d-4a1b-b904-857dad56b5ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720250 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5be86044-916e-44ba-9855-360b0ae2471a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720301 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd12b52-828d-4244-99e9-2291f3a0bbb6-config\") pod \"kube-apiserver-operator-766d6c64bb-772k8\" (UID: \"dcd12b52-828d-4244-99e9-2291f3a0bbb6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720332 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-244zr\" (UniqueName: \"kubernetes.io/projected/05b20433-569d-4f1d-acd8-127119a934e1-kube-api-access-244zr\") pod \"openshift-controller-manager-operator-756b6f6bc6-h7h7w\" (UID: \"05b20433-569d-4f1d-acd8-127119a934e1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720353 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc8l7\" (UniqueName: \"kubernetes.io/projected/475a4aef-33e7-40ad-b7c4-20a10efa4ec3-kube-api-access-cc8l7\") pod \"downloads-7954f5f757-nt6rp\" (UID: \"475a4aef-33e7-40ad-b7c4-20a10efa4ec3\") " pod="openshift-console/downloads-7954f5f757-nt6rp" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720385 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/459b67d7-cbf3-4890-a9a2-3e143ee459aa-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jc5dv\" (UID: \"459b67d7-cbf3-4890-a9a2-3e143ee459aa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720406 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/193a08ea-86ea-4176-898e-6a2c476ea6e9-console-oauth-config\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720428 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-oauth-serving-cert\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720457 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec68f29b-2b30-4dff-b875-7617466be51b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tjg8b\" (UID: \"ec68f29b-2b30-4dff-b875-7617466be51b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720490 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/607f973b-fc78-4c11-bc09-fdbbe414d8c7-default-certificate\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720520 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhl8d\" (UniqueName: \"kubernetes.io/projected/7e83c76a-d44d-4a1b-b904-857dad56b5ba-kube-api-access-bhl8d\") pod \"openshift-apiserver-operator-796bbdcf4f-ddxk7\" (UID: \"7e83c76a-d44d-4a1b-b904-857dad56b5ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720539 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720563 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c9ce6bb-224d-498c-9299-762d84b0eaa3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720591 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720596 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05b20433-569d-4f1d-acd8-127119a934e1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h7h7w\" (UID: \"05b20433-569d-4f1d-acd8-127119a934e1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720613 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/459b67d7-cbf3-4890-a9a2-3e143ee459aa-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jc5dv\" (UID: \"459b67d7-cbf3-4890-a9a2-3e143ee459aa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720637 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-console-config\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720660 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4648\" (UniqueName: \"kubernetes.io/projected/e6f70216-3d55-4dd8-81f7-f2129a277407-kube-api-access-q4648\") pod \"openshift-config-operator-7777fb866f-vl2l2\" (UID: \"e6f70216-3d55-4dd8-81f7-f2129a277407\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720701 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720713 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720732 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720757 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ea5ac78-6111-4f12-a2db-3a6f4e400d39-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rnh4z\" (UID: \"9ea5ac78-6111-4f12-a2db-3a6f4e400d39\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720755 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-service-ca\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720787 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f70216-3d55-4dd8-81f7-f2129a277407-serving-cert\") pod \"openshift-config-operator-7777fb866f-vl2l2\" (UID: \"e6f70216-3d55-4dd8-81f7-f2129a277407\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720810 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcd12b52-828d-4244-99e9-2291f3a0bbb6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-772k8\" (UID: \"dcd12b52-828d-4244-99e9-2291f3a0bbb6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720830 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c9ce6bb-224d-498c-9299-762d84b0eaa3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720852 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/518a161e-aeab-4ad6-a2c0-dee7ec963958-serviceca\") pod \"image-pruner-29491200-4blsg\" (UID: \"518a161e-aeab-4ad6-a2c0-dee7ec963958\") " pod="openshift-image-registry/image-pruner-29491200-4blsg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720888 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ea5ac78-6111-4f12-a2db-3a6f4e400d39-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rnh4z\" (UID: 
\"9ea5ac78-6111-4f12-a2db-3a6f4e400d39\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720906 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-etcd-service-ca\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720942 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/35026702-de7c-4f5f-8714-f1d7f89adae6-metrics-tls\") pod \"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720965 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35026702-de7c-4f5f-8714-f1d7f89adae6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720986 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.721017 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/459b67d7-cbf3-4890-a9a2-3e143ee459aa-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jc5dv\" (UID: \"459b67d7-cbf3-4890-a9a2-3e143ee459aa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.721040 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35026702-de7c-4f5f-8714-f1d7f89adae6-trusted-ca\") pod \"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.721060 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e6f70216-3d55-4dd8-81f7-f2129a277407-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vl2l2\" (UID: \"e6f70216-3d55-4dd8-81f7-f2129a277407\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.721081 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-policies\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc 
kubenswrapper[4774]: I0127 00:09:25.721793 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-policies\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.722100 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5be86044-916e-44ba-9855-360b0ae2471a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.722120 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-oauth-serving-cert\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.722178 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e83c76a-d44d-4a1b-b904-857dad56b5ba-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ddxk7\" (UID: \"7e83c76a-d44d-4a1b-b904-857dad56b5ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.722438 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.722574 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c9ce6bb-224d-498c-9299-762d84b0eaa3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.724891 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.725194 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-dir\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.725351 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35026702-de7c-4f5f-8714-f1d7f89adae6-trusted-ca\") pod 
\"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.725847 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e6f70216-3d55-4dd8-81f7-f2129a277407-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vl2l2\" (UID: \"e6f70216-3d55-4dd8-81f7-f2129a277407\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.725915 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.727142 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-trusted-ca-bundle\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.728546 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.730380 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/518a161e-aeab-4ad6-a2c0-dee7ec963958-serviceca\") pod \"image-pruner-29491200-4blsg\" (UID: \"518a161e-aeab-4ad6-a2c0-dee7ec963958\") " pod="openshift-image-registry/image-pruner-29491200-4blsg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.730440 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.731193 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.731621 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.735166 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.735967 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.736173 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e83c76a-d44d-4a1b-b904-857dad56b5ba-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ddxk7\" (UID: \"7e83c76a-d44d-4a1b-b904-857dad56b5ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.736318 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c9ce6bb-224d-498c-9299-762d84b0eaa3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.738053 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-etcd-ca\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.738168 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f70216-3d55-4dd8-81f7-f2129a277407-serving-cert\") pod \"openshift-config-operator-7777fb866f-vl2l2\" (UID: \"e6f70216-3d55-4dd8-81f7-f2129a277407\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.738184 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05b20433-569d-4f1d-acd8-127119a934e1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-h7h7w\" (UID: \"05b20433-569d-4f1d-acd8-127119a934e1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.738291 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-etcd-client\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.738509 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/193a08ea-86ea-4176-898e-6a2c476ea6e9-console-serving-cert\") pod \"console-f9d7485db-776sg\" (UID: 
\"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.738803 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/193a08ea-86ea-4176-898e-6a2c476ea6e9-console-oauth-config\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.739495 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.752874 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.753894 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.772761 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.791312 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.811941 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.815629 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-etcd-service-ca\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.831206 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.841171 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/35026702-de7c-4f5f-8714-f1d7f89adae6-metrics-tls\") pod \"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.851972 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.862745 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-serving-cert\") pod \"etcd-operator-b45778765-hm9kj\" (UID: 
\"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.873335 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.892518 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.897549 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-config\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.911734 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.932354 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.952260 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.972353 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.006075 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.013281 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.031850 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.033436 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-console-config\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.053006 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.057769 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ea5ac78-6111-4f12-a2db-3a6f4e400d39-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rnh4z\" (UID: \"9ea5ac78-6111-4f12-a2db-3a6f4e400d39\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.071438 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.081724 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ea5ac78-6111-4f12-a2db-3a6f4e400d39-config\") pod \"kube-controller-manager-operator-78b949d7b-rnh4z\" (UID: \"9ea5ac78-6111-4f12-a2db-3a6f4e400d39\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.112201 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.113132 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd12b52-828d-4244-99e9-2291f3a0bbb6-config\") pod \"kube-apiserver-operator-766d6c64bb-772k8\" (UID: \"dcd12b52-828d-4244-99e9-2291f3a0bbb6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.131840 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.152758 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.165965 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcd12b52-828d-4244-99e9-2291f3a0bbb6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-772k8\" (UID: \"dcd12b52-828d-4244-99e9-2291f3a0bbb6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.171846 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.193392 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.212356 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.221229 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/607f973b-fc78-4c11-bc09-fdbbe414d8c7-stats-auth\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.233167 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.245382 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/607f973b-fc78-4c11-bc09-fdbbe414d8c7-metrics-certs\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.252590 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.272012 
4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.277459 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/607f973b-fc78-4c11-bc09-fdbbe414d8c7-default-certificate\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.292594 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.298449 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/607f973b-fc78-4c11-bc09-fdbbe414d8c7-service-ca-bundle\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.312915 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.333909 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.352779 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.372529 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.379742 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/459b67d7-cbf3-4890-a9a2-3e143ee459aa-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jc5dv\" (UID: \"459b67d7-cbf3-4890-a9a2-3e143ee459aa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.393337 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.396951 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/459b67d7-cbf3-4890-a9a2-3e143ee459aa-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jc5dv\" (UID: \"459b67d7-cbf3-4890-a9a2-3e143ee459aa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.412931 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.422478 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5be86044-916e-44ba-9855-360b0ae2471a-images\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.432901 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.447037 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec68f29b-2b30-4dff-b875-7617466be51b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tjg8b\" (UID: \"ec68f29b-2b30-4dff-b875-7617466be51b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.452833 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.472272 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.476694 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5be86044-916e-44ba-9855-360b0ae2471a-proxy-tls\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.491709 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.532037 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.552020 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.572777 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.592801 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.609969 4774 request.go:700] Waited for 1.010469645s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmultus-admission-controller-secret&limit=500&resourceVersion=0 Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.612192 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.632485 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.673115 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l72t8\" (UniqueName: 
\"kubernetes.io/projected/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-kube-api-access-l72t8\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.690356 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrbdc\" (UniqueName: \"kubernetes.io/projected/98696d49-1f48-4d0d-9e26-691558f704c5-kube-api-access-nrbdc\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.691831 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.712552 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.715105 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz5lb\" (UniqueName: \"kubernetes.io/projected/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-kube-api-access-tz5lb\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.732722 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.759485 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.770817 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lxgx\" (UniqueName: \"kubernetes.io/projected/9166b710-4bb0-4fc0-8e54-45907543c22f-kube-api-access-4lxgx\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.792569 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.793676 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btcqh\" (UniqueName: \"kubernetes.io/projected/7cab68e6-3113-4536-b5c3-2293265c9502-kube-api-access-btcqh\") pod \"dns-operator-744455d44c-w8qm5\" (UID: \"7cab68e6-3113-4536-b5c3-2293265c9502\") " pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.834762 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npcps\" (UniqueName: \"kubernetes.io/projected/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-kube-api-access-npcps\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.843386 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59zpq\" (UniqueName: \"kubernetes.io/projected/7cb54417-3e44-4c36-a1cf-438a705f8dcf-kube-api-access-59zpq\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.855438 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksgtc\" (UniqueName: \"kubernetes.io/projected/db8adc8a-118c-485d-b752-c3d0a2888458-kube-api-access-ksgtc\") pod \"cluster-samples-operator-665b6dd947-5rsb9\" (UID: \"db8adc8a-118c-485d-b752-c3d0a2888458\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.867458 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.870381 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw55r\" (UniqueName: \"kubernetes.io/projected/956bdf0f-108e-4966-bbb6-77be23255cec-kube-api-access-dw55r\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.891754 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.894727 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj8bl\" (UniqueName: \"kubernetes.io/projected/1df48642-0cb9-4242-ae92-16b239115170-kube-api-access-hj8bl\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.917249 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.932542 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.950125 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.952136 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.983825 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.992910 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.994166 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-j4bbk"] Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.001027 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:27 crc kubenswrapper[4774]: W0127 00:09:27.003055 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98696d49_1f48_4d0d_9e26_691558f704c5.slice/crio-b92d7c910113ca7731f1341fe96c66c9b8adf4f2552c8ba080026856f64ef501 WatchSource:0}: Error finding container b92d7c910113ca7731f1341fe96c66c9b8adf4f2552c8ba080026856f64ef501: Status 404 returned error can't find the container with id b92d7c910113ca7731f1341fe96c66c9b8adf4f2552c8ba080026856f64ef501 Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.012596 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.018093 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm"] Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.026534 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.032379 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 27 00:09:27 crc kubenswrapper[4774]: W0127 00:09:27.038654 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86bc63c8_36a5_4c7a_bf5d_c8f93744a251.slice/crio-e0ac04bc30e5a5895615f837f5c07fb55b29ff08ef34335c00fb2b41eff339ca WatchSource:0}: Error finding container e0ac04bc30e5a5895615f837f5c07fb55b29ff08ef34335c00fb2b41eff339ca: Status 404 returned error can't find the container with id e0ac04bc30e5a5895615f837f5c07fb55b29ff08ef34335c00fb2b41eff339ca Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.039055 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.053716 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.054621 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5b8h8"] Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.071379 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.089157 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w8qm5"] Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.091840 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.106808 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.113022 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.132279 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.153494 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.158718 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.165590 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gstl6"] Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.173077 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: W0127 00:09:27.193802 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc64d4fcd_470f_43e8_bab0_09c0b59eaf6d.slice/crio-f8f2d0f50c62531a6b84888ac44898fa7725445d28531a88079b90066b46c1bd WatchSource:0}: Error finding container f8f2d0f50c62531a6b84888ac44898fa7725445d28531a88079b90066b46c1bd: Status 404 returned error can't find the container with id f8f2d0f50c62531a6b84888ac44898fa7725445d28531a88079b90066b46c1bd Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.194065 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.207449 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" event={"ID":"7cab68e6-3113-4536-b5c3-2293265c9502","Type":"ContainerStarted","Data":"8dafee3215369519aaec1b82214b50f407e2311b31a51bec8c51f4d7d870a53c"} Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.209197 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" event={"ID":"1df48642-0cb9-4242-ae92-16b239115170","Type":"ContainerStarted","Data":"0d88e49e1a3e17cc6e532723981b1d1e24f28507b4012595a22573fdde0c7df3"} Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.210602 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" event={"ID":"9166b710-4bb0-4fc0-8e54-45907543c22f","Type":"ContainerStarted","Data":"20e7f45e0bc628b7de64cc93ac19fdef43b151e10efd40aab961cc3102eb973d"} Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.212649 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.218152 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" 
event={"ID":"86bc63c8-36a5-4c7a-bf5d-c8f93744a251","Type":"ContainerStarted","Data":"e0ac04bc30e5a5895615f837f5c07fb55b29ff08ef34335c00fb2b41eff339ca"} Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.232125 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.243602 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" event={"ID":"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d","Type":"ContainerStarted","Data":"f8f2d0f50c62531a6b84888ac44898fa7725445d28531a88079b90066b46c1bd"} Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.249487 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" event={"ID":"98696d49-1f48-4d0d-9e26-691558f704c5","Type":"ContainerStarted","Data":"b92d7c910113ca7731f1341fe96c66c9b8adf4f2552c8ba080026856f64ef501"} Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.252940 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.260389 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kv45n"] Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.272544 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.292290 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.306696 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4"] Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.312536 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.332765 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.353206 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 00:09:27 crc kubenswrapper[4774]: W0127 00:09:27.368430 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d6a02f8_e5e6_49b7_99cb_29d40ba9fdff.slice/crio-63fe7051984d9e95bbe56ab4890d8824695c38cf0105a3b96dcf3496f186c863 WatchSource:0}: Error finding container 63fe7051984d9e95bbe56ab4890d8824695c38cf0105a3b96dcf3496f186c863: Status 404 returned error can't find the container with id 63fe7051984d9e95bbe56ab4890d8824695c38cf0105a3b96dcf3496f186c863 Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.372852 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.392662 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 
00:09:27.415148 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.434449 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9"] Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.436694 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.454067 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.472145 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.472999 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r9m95"] Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.492286 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.512125 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.533700 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.551617 4774 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.572145 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.592290 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.610537 4774 request.go:700] Waited for 1.913700516s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-default-metrics-tls&limit=500&resourceVersion=0 Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.612513 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.633330 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.652632 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.673040 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.694218 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.712881 4774 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.760430 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckh7l\" (UniqueName: \"kubernetes.io/projected/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-kube-api-access-ckh7l\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.775358 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kbkr\" (UniqueName: \"kubernetes.io/projected/518a161e-aeab-4ad6-a2c0-dee7ec963958-kube-api-access-8kbkr\") pod \"image-pruner-29491200-4blsg\" (UID: \"518a161e-aeab-4ad6-a2c0-dee7ec963958\") " pod="openshift-image-registry/image-pruner-29491200-4blsg" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.785761 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29491200-4blsg" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.805796 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftjqz\" (UniqueName: \"kubernetes.io/projected/8c9ce6bb-224d-498c-9299-762d84b0eaa3-kube-api-access-ftjqz\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.808101 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9lnl\" (UniqueName: \"kubernetes.io/projected/35026702-de7c-4f5f-8714-f1d7f89adae6-kube-api-access-b9lnl\") pod \"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.829707 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q79th\" (UniqueName: \"kubernetes.io/projected/ec68f29b-2b30-4dff-b875-7617466be51b-kube-api-access-q79th\") pod \"control-plane-machine-set-operator-78cbb6b69f-tjg8b\" (UID: \"ec68f29b-2b30-4dff-b875-7617466be51b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.848537 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc8l7\" (UniqueName: \"kubernetes.io/projected/475a4aef-33e7-40ad-b7c4-20a10efa4ec3-kube-api-access-cc8l7\") pod \"downloads-7954f5f757-nt6rp\" (UID: \"475a4aef-33e7-40ad-b7c4-20a10efa4ec3\") " pod="openshift-console/downloads-7954f5f757-nt6rp" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.866370 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/459b67d7-cbf3-4890-a9a2-3e143ee459aa-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jc5dv\" (UID: \"459b67d7-cbf3-4890-a9a2-3e143ee459aa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.885611 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.900283 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c9ce6bb-224d-498c-9299-762d84b0eaa3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.917671 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-244zr\" (UniqueName: \"kubernetes.io/projected/05b20433-569d-4f1d-acd8-127119a934e1-kube-api-access-244zr\") pod \"openshift-controller-manager-operator-756b6f6bc6-h7h7w\" (UID: \"05b20433-569d-4f1d-acd8-127119a934e1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.926810 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.932094 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhl8d\" (UniqueName: \"kubernetes.io/projected/7e83c76a-d44d-4a1b-b904-857dad56b5ba-kube-api-access-bhl8d\") pod \"openshift-apiserver-operator-796bbdcf4f-ddxk7\" (UID: \"7e83c76a-d44d-4a1b-b904-857dad56b5ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.941636 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.951974 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcd12b52-828d-4244-99e9-2291f3a0bbb6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-772k8\" (UID: \"dcd12b52-828d-4244-99e9-2291f3a0bbb6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.973292 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh9kp\" (UniqueName: \"kubernetes.io/projected/f8bb8619-eac4-481e-bdb8-5fb5985c1844-kube-api-access-mh9kp\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.989325 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ea5ac78-6111-4f12-a2db-3a6f4e400d39-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rnh4z\" (UID: \"9ea5ac78-6111-4f12-a2db-3a6f4e400d39\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.017398 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4648\" (UniqueName: \"kubernetes.io/projected/e6f70216-3d55-4dd8-81f7-f2129a277407-kube-api-access-q4648\") pod \"openshift-config-operator-7777fb866f-vl2l2\" (UID: \"e6f70216-3d55-4dd8-81f7-f2129a277407\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.032697 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35026702-de7c-4f5f-8714-f1d7f89adae6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.050387 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29491200-4blsg"] Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.068240 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwq54\" (UniqueName: \"kubernetes.io/projected/193a08ea-86ea-4176-898e-6a2c476ea6e9-kube-api-access-nwq54\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.068604 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzc9x\" (UniqueName: \"kubernetes.io/projected/5be86044-916e-44ba-9855-360b0ae2471a-kube-api-access-fzc9x\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.078086 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.096527 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9vrd\" (UniqueName: \"kubernetes.io/projected/607f973b-fc78-4c11-bc09-fdbbe414d8c7-kube-api-access-c9vrd\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.099616 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.109284 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.119283 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-nt6rp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.150684 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hm9kj"] Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.158290 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.164507 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.167197 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df626623-28b8-43a3-a567-f14b1e95075a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.167255 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-registry-tls\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.167301 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-trusted-ca\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.167353 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 
00:09:28.167383 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-bound-sa-token\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.167410 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df626623-28b8-43a3-a567-f14b1e95075a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.167436 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj86g\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-kube-api-access-sj86g\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.167494 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-registry-certificates\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: E0127 00:09:28.168257 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:28.668189366 +0000 UTC m=+146.973966240 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.170246 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.178567 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.201773 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.209470 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.209908 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b"] Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.217458 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.234297 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.269985 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:28 crc kubenswrapper[4774]: E0127 00:09:28.270124 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:28.77009517 +0000 UTC m=+147.075872054 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270249 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/697daa16-5938-4757-a941-83fa9dbb019b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-49cqc\" (UID: \"697daa16-5938-4757-a941-83fa9dbb019b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270333 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-mountpoint-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270360 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vxtrq\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270428 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-trusted-ca\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270476 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bfd245d-eb59-40d5-b1cf-517beaa46f32-profile-collector-cert\") pod \"catalog-operator-68c6474976-bgm7n\" (UID: \"7bfd245d-eb59-40d5-b1cf-517beaa46f32\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270504 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/75b25a91-05cf-44c9-b656-c8a07515d84f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4txdd\" (UID: \"75b25a91-05cf-44c9-b656-c8a07515d84f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270539 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270575 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh2lr\" (UniqueName: \"kubernetes.io/projected/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-kube-api-access-xh2lr\") pod \"marketplace-operator-79b997595-vxtrq\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270616 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-bound-sa-token\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270637 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c89zl\" (UniqueName: \"kubernetes.io/projected/995e8bee-a3aa-466d-8007-4d12eab0d045-kube-api-access-c89zl\") pod \"dns-default-th968\" (UID: \"995e8bee-a3aa-466d-8007-4d12eab0d045\") " pod="openshift-dns/dns-default-th968" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270679 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bkh6\" (UniqueName: \"kubernetes.io/projected/31004ea4-e1fa-489e-a44e-701f370c9899-kube-api-access-4bkh6\") pod \"machine-config-controller-84d6567774-dhlhj\" (UID: \"31004ea4-e1fa-489e-a44e-701f370c9899\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270700 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khrgg\" (UniqueName: 
\"kubernetes.io/projected/9e947476-a4bc-441e-97ab-2caba294339b-kube-api-access-khrgg\") pod \"service-ca-operator-777779d784-cz4cq\" (UID: \"9e947476-a4bc-441e-97ab-2caba294339b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270728 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b613d2c-46d4-461a-9414-d2ce3b1788bf-config-volume\") pod \"collect-profiles-29491200-fxl4m\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270771 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wp9h\" (UniqueName: \"kubernetes.io/projected/697daa16-5938-4757-a941-83fa9dbb019b-kube-api-access-6wp9h\") pod \"package-server-manager-789f6589d5-49cqc\" (UID: \"697daa16-5938-4757-a941-83fa9dbb019b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270795 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df626623-28b8-43a3-a567-f14b1e95075a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270815 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-apiservice-cert\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270884 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjzkw\" (UniqueName: \"kubernetes.io/projected/16b7780a-d64f-4a63-b462-c924d7c44aac-kube-api-access-tjzkw\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270909 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj86g\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-kube-api-access-sj86g\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270939 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-socket-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270959 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgg6q\" (UniqueName: 
\"kubernetes.io/projected/337a9389-b527-41a3-aba6-fada15fb648c-kube-api-access-fgg6q\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7qfw\" (UID: \"337a9389-b527-41a3-aba6-fada15fb648c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.271006 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp85p\" (UniqueName: \"kubernetes.io/projected/0b613d2c-46d4-461a-9414-d2ce3b1788bf-kube-api-access-rp85p\") pod \"collect-profiles-29491200-fxl4m\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.271057 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e947476-a4bc-441e-97ab-2caba294339b-serving-cert\") pod \"service-ca-operator-777779d784-cz4cq\" (UID: \"9e947476-a4bc-441e-97ab-2caba294339b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.271156 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/337a9389-b527-41a3-aba6-fada15fb648c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7qfw\" (UID: \"337a9389-b527-41a3-aba6-fada15fb648c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.272493 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df626623-28b8-43a3-a567-f14b1e95075a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: E0127 00:09:28.272641 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:28.772631568 +0000 UTC m=+147.078408452 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.273714 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-trusted-ca\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.274180 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-tmpfs\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.274545 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bfd245d-eb59-40d5-b1cf-517beaa46f32-srv-cert\") pod \"catalog-operator-68c6474976-bgm7n\" (UID: \"7bfd245d-eb59-40d5-b1cf-517beaa46f32\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.274802 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f6821801-4d19-46b9-8a38-e55810cb2dbf-signing-cabundle\") pod \"service-ca-9c57cc56f-8kqjp\" (UID: \"f6821801-4d19-46b9-8a38-e55810cb2dbf\") " pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276377 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/30b3f92f-3bf5-448c-9542-6217fb51f239-node-bootstrap-token\") pod \"machine-config-server-pjtrg\" (UID: \"30b3f92f-3bf5-448c-9542-6217fb51f239\") " pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276440 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-registry-certificates\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276458 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/30b3f92f-3bf5-448c-9542-6217fb51f239-certs\") pod \"machine-config-server-pjtrg\" (UID: \"30b3f92f-3bf5-448c-9542-6217fb51f239\") " pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276487 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gxxff\" (UniqueName: \"kubernetes.io/projected/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-kube-api-access-gxxff\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276524 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-registration-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276565 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e947476-a4bc-441e-97ab-2caba294339b-config\") pod \"service-ca-operator-777779d784-cz4cq\" (UID: \"9e947476-a4bc-441e-97ab-2caba294339b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276646 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2fq5\" (UniqueName: \"kubernetes.io/projected/39fc27d8-358f-40a9-8128-35b2e8f96f5c-kube-api-access-t2fq5\") pod \"ingress-canary-ttgpc\" (UID: \"39fc27d8-358f-40a9-8128-35b2e8f96f5c\") " pod="openshift-ingress-canary/ingress-canary-ttgpc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276707 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vxtrq\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276731 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df626623-28b8-43a3-a567-f14b1e95075a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276755 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtcp4\" (UniqueName: \"kubernetes.io/projected/f6821801-4d19-46b9-8a38-e55810cb2dbf-kube-api-access-dtcp4\") pod \"service-ca-9c57cc56f-8kqjp\" (UID: \"f6821801-4d19-46b9-8a38-e55810cb2dbf\") " pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276771 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-plugins-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276808 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f6821801-4d19-46b9-8a38-e55810cb2dbf-signing-key\") pod 
\"service-ca-9c57cc56f-8kqjp\" (UID: \"f6821801-4d19-46b9-8a38-e55810cb2dbf\") " pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276869 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b613d2c-46d4-461a-9414-d2ce3b1788bf-secret-volume\") pod \"collect-profiles-29491200-fxl4m\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276887 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/47e6217b-24f0-495d-8c8e-cb684083e8dd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8z5sz\" (UID: \"47e6217b-24f0-495d-8c8e-cb684083e8dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276934 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/75b25a91-05cf-44c9-b656-c8a07515d84f-srv-cert\") pod \"olm-operator-6b444d44fb-4txdd\" (UID: \"75b25a91-05cf-44c9-b656-c8a07515d84f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276952 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnnw2\" (UniqueName: \"kubernetes.io/projected/4718399b-22db-4443-bc52-22e461891f11-kube-api-access-mnnw2\") pod \"migrator-59844c95c7-97wdt\" (UID: \"4718399b-22db-4443-bc52-22e461891f11\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276987 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vcnl\" (UniqueName: \"kubernetes.io/projected/75b25a91-05cf-44c9-b656-c8a07515d84f-kube-api-access-5vcnl\") pod \"olm-operator-6b444d44fb-4txdd\" (UID: \"75b25a91-05cf-44c9-b656-c8a07515d84f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277021 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/995e8bee-a3aa-466d-8007-4d12eab0d045-metrics-tls\") pod \"dns-default-th968\" (UID: \"995e8bee-a3aa-466d-8007-4d12eab0d045\") " pod="openshift-dns/dns-default-th968" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277049 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfhft\" (UniqueName: \"kubernetes.io/projected/30b3f92f-3bf5-448c-9542-6217fb51f239-kube-api-access-sfhft\") pod \"machine-config-server-pjtrg\" (UID: \"30b3f92f-3bf5-448c-9542-6217fb51f239\") " pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277066 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psmsg\" (UniqueName: \"kubernetes.io/projected/7bfd245d-eb59-40d5-b1cf-517beaa46f32-kube-api-access-psmsg\") pod \"catalog-operator-68c6474976-bgm7n\" (UID: \"7bfd245d-eb59-40d5-b1cf-517beaa46f32\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277083 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/995e8bee-a3aa-466d-8007-4d12eab0d045-config-volume\") pod \"dns-default-th968\" (UID: \"995e8bee-a3aa-466d-8007-4d12eab0d045\") " pod="openshift-dns/dns-default-th968" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277109 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-webhook-cert\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277170 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/337a9389-b527-41a3-aba6-fada15fb648c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7qfw\" (UID: \"337a9389-b527-41a3-aba6-fada15fb648c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277199 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9pnb\" (UniqueName: \"kubernetes.io/projected/47e6217b-24f0-495d-8c8e-cb684083e8dd-kube-api-access-p9pnb\") pod \"multus-admission-controller-857f4d67dd-8z5sz\" (UID: \"47e6217b-24f0-495d-8c8e-cb684083e8dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277673 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-csi-data-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277746 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/39fc27d8-358f-40a9-8128-35b2e8f96f5c-cert\") pod \"ingress-canary-ttgpc\" (UID: \"39fc27d8-358f-40a9-8128-35b2e8f96f5c\") " pod="openshift-ingress-canary/ingress-canary-ttgpc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277797 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31004ea4-e1fa-489e-a44e-701f370c9899-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dhlhj\" (UID: \"31004ea4-e1fa-489e-a44e-701f370c9899\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277841 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-registry-tls\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: 
I0127 00:09:28.277874 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31004ea4-e1fa-489e-a44e-701f370c9899-proxy-tls\") pod \"machine-config-controller-84d6567774-dhlhj\" (UID: \"31004ea4-e1fa-489e-a44e-701f370c9899\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.284016 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-registry-certificates\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.286690 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-registry-tls\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.288411 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df626623-28b8-43a3-a567-f14b1e95075a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.291537 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv"] Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.311474 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj86g\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-kube-api-access-sj86g\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.327416 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-bound-sa-token\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.337379 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" event={"ID":"7cb54417-3e44-4c36-a1cf-438a705f8dcf","Type":"ContainerStarted","Data":"9aec1831ac31422db43bdee5e6e742d2fdd80c086acbd5a9cc29da470da1d7eb"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.337432 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" event={"ID":"7cb54417-3e44-4c36-a1cf-438a705f8dcf","Type":"ContainerStarted","Data":"3c0392836f2dc327d5d8301bf0194f0bc3a8672e190262cb3c51dd36dcfb7803"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.337445 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" 
event={"ID":"7cb54417-3e44-4c36-a1cf-438a705f8dcf","Type":"ContainerStarted","Data":"f168fbc6b9f63899e12c4476d37e1c492cd33317d74be0e489ae286808a4ef57"} Jan 27 00:09:28 crc kubenswrapper[4774]: W0127 00:09:28.346775 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec68f29b_2b30_4dff_b875_7617466be51b.slice/crio-094c34af32cd5eaea1f5457aa5e6dd3fb684dafbcd4d452b53af81c3814744b1 WatchSource:0}: Error finding container 094c34af32cd5eaea1f5457aa5e6dd3fb684dafbcd4d452b53af81c3814744b1: Status 404 returned error can't find the container with id 094c34af32cd5eaea1f5457aa5e6dd3fb684dafbcd4d452b53af81c3814744b1 Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.378739 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:28 crc kubenswrapper[4774]: E0127 00:09:28.379577 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:28.879530804 +0000 UTC m=+147.185307688 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.381889 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-socket-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.381942 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgg6q\" (UniqueName: \"kubernetes.io/projected/337a9389-b527-41a3-aba6-fada15fb648c-kube-api-access-fgg6q\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7qfw\" (UID: \"337a9389-b527-41a3-aba6-fada15fb648c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.381976 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp85p\" (UniqueName: \"kubernetes.io/projected/0b613d2c-46d4-461a-9414-d2ce3b1788bf-kube-api-access-rp85p\") pod \"collect-profiles-29491200-fxl4m\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382015 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-tmpfs\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: 
\"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382039 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e947476-a4bc-441e-97ab-2caba294339b-serving-cert\") pod \"service-ca-operator-777779d784-cz4cq\" (UID: \"9e947476-a4bc-441e-97ab-2caba294339b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382062 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/337a9389-b527-41a3-aba6-fada15fb648c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7qfw\" (UID: \"337a9389-b527-41a3-aba6-fada15fb648c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382093 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bfd245d-eb59-40d5-b1cf-517beaa46f32-srv-cert\") pod \"catalog-operator-68c6474976-bgm7n\" (UID: \"7bfd245d-eb59-40d5-b1cf-517beaa46f32\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382129 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f6821801-4d19-46b9-8a38-e55810cb2dbf-signing-cabundle\") pod \"service-ca-9c57cc56f-8kqjp\" (UID: \"f6821801-4d19-46b9-8a38-e55810cb2dbf\") " pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382157 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/30b3f92f-3bf5-448c-9542-6217fb51f239-node-bootstrap-token\") pod \"machine-config-server-pjtrg\" (UID: \"30b3f92f-3bf5-448c-9542-6217fb51f239\") " pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382180 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/30b3f92f-3bf5-448c-9542-6217fb51f239-certs\") pod \"machine-config-server-pjtrg\" (UID: \"30b3f92f-3bf5-448c-9542-6217fb51f239\") " pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382199 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxxff\" (UniqueName: \"kubernetes.io/projected/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-kube-api-access-gxxff\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382222 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-registration-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382254 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e947476-a4bc-441e-97ab-2caba294339b-config\") pod \"service-ca-operator-777779d784-cz4cq\" (UID: \"9e947476-a4bc-441e-97ab-2caba294339b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382283 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2fq5\" (UniqueName: \"kubernetes.io/projected/39fc27d8-358f-40a9-8128-35b2e8f96f5c-kube-api-access-t2fq5\") pod \"ingress-canary-ttgpc\" (UID: \"39fc27d8-358f-40a9-8128-35b2e8f96f5c\") " pod="openshift-ingress-canary/ingress-canary-ttgpc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382311 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vxtrq\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382334 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtcp4\" (UniqueName: \"kubernetes.io/projected/f6821801-4d19-46b9-8a38-e55810cb2dbf-kube-api-access-dtcp4\") pod \"service-ca-9c57cc56f-8kqjp\" (UID: \"f6821801-4d19-46b9-8a38-e55810cb2dbf\") " pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382355 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-plugins-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382376 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f6821801-4d19-46b9-8a38-e55810cb2dbf-signing-key\") pod \"service-ca-9c57cc56f-8kqjp\" (UID: \"f6821801-4d19-46b9-8a38-e55810cb2dbf\") " pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382398 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/47e6217b-24f0-495d-8c8e-cb684083e8dd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8z5sz\" (UID: \"47e6217b-24f0-495d-8c8e-cb684083e8dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382423 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b613d2c-46d4-461a-9414-d2ce3b1788bf-secret-volume\") pod \"collect-profiles-29491200-fxl4m\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382445 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/75b25a91-05cf-44c9-b656-c8a07515d84f-srv-cert\") pod \"olm-operator-6b444d44fb-4txdd\" (UID: \"75b25a91-05cf-44c9-b656-c8a07515d84f\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382468 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnnw2\" (UniqueName: \"kubernetes.io/projected/4718399b-22db-4443-bc52-22e461891f11-kube-api-access-mnnw2\") pod \"migrator-59844c95c7-97wdt\" (UID: \"4718399b-22db-4443-bc52-22e461891f11\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382513 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vcnl\" (UniqueName: \"kubernetes.io/projected/75b25a91-05cf-44c9-b656-c8a07515d84f-kube-api-access-5vcnl\") pod \"olm-operator-6b444d44fb-4txdd\" (UID: \"75b25a91-05cf-44c9-b656-c8a07515d84f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382550 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/995e8bee-a3aa-466d-8007-4d12eab0d045-metrics-tls\") pod \"dns-default-th968\" (UID: \"995e8bee-a3aa-466d-8007-4d12eab0d045\") " pod="openshift-dns/dns-default-th968" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382582 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfhft\" (UniqueName: \"kubernetes.io/projected/30b3f92f-3bf5-448c-9542-6217fb51f239-kube-api-access-sfhft\") pod \"machine-config-server-pjtrg\" (UID: \"30b3f92f-3bf5-448c-9542-6217fb51f239\") " pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382610 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psmsg\" (UniqueName: \"kubernetes.io/projected/7bfd245d-eb59-40d5-b1cf-517beaa46f32-kube-api-access-psmsg\") pod \"catalog-operator-68c6474976-bgm7n\" (UID: \"7bfd245d-eb59-40d5-b1cf-517beaa46f32\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382635 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/995e8bee-a3aa-466d-8007-4d12eab0d045-config-volume\") pod \"dns-default-th968\" (UID: \"995e8bee-a3aa-466d-8007-4d12eab0d045\") " pod="openshift-dns/dns-default-th968" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382646 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-tmpfs\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382677 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-webhook-cert\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382756 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/337a9389-b527-41a3-aba6-fada15fb648c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7qfw\" (UID: \"337a9389-b527-41a3-aba6-fada15fb648c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382792 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9pnb\" (UniqueName: \"kubernetes.io/projected/47e6217b-24f0-495d-8c8e-cb684083e8dd-kube-api-access-p9pnb\") pod \"multus-admission-controller-857f4d67dd-8z5sz\" (UID: \"47e6217b-24f0-495d-8c8e-cb684083e8dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382844 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/39fc27d8-358f-40a9-8128-35b2e8f96f5c-cert\") pod \"ingress-canary-ttgpc\" (UID: \"39fc27d8-358f-40a9-8128-35b2e8f96f5c\") " pod="openshift-ingress-canary/ingress-canary-ttgpc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382913 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-csi-data-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382947 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31004ea4-e1fa-489e-a44e-701f370c9899-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dhlhj\" (UID: \"31004ea4-e1fa-489e-a44e-701f370c9899\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382991 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31004ea4-e1fa-489e-a44e-701f370c9899-proxy-tls\") pod \"machine-config-controller-84d6567774-dhlhj\" (UID: \"31004ea4-e1fa-489e-a44e-701f370c9899\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383024 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/697daa16-5938-4757-a941-83fa9dbb019b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-49cqc\" (UID: \"697daa16-5938-4757-a941-83fa9dbb019b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383088 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-mountpoint-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383117 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-vxtrq\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383158 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bfd245d-eb59-40d5-b1cf-517beaa46f32-profile-collector-cert\") pod \"catalog-operator-68c6474976-bgm7n\" (UID: \"7bfd245d-eb59-40d5-b1cf-517beaa46f32\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383188 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/75b25a91-05cf-44c9-b656-c8a07515d84f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4txdd\" (UID: \"75b25a91-05cf-44c9-b656-c8a07515d84f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383264 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383291 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh2lr\" (UniqueName: \"kubernetes.io/projected/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-kube-api-access-xh2lr\") pod \"marketplace-operator-79b997595-vxtrq\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383323 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c89zl\" (UniqueName: \"kubernetes.io/projected/995e8bee-a3aa-466d-8007-4d12eab0d045-kube-api-access-c89zl\") pod \"dns-default-th968\" (UID: \"995e8bee-a3aa-466d-8007-4d12eab0d045\") " pod="openshift-dns/dns-default-th968" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383356 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bkh6\" (UniqueName: \"kubernetes.io/projected/31004ea4-e1fa-489e-a44e-701f370c9899-kube-api-access-4bkh6\") pod \"machine-config-controller-84d6567774-dhlhj\" (UID: \"31004ea4-e1fa-489e-a44e-701f370c9899\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383378 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khrgg\" (UniqueName: \"kubernetes.io/projected/9e947476-a4bc-441e-97ab-2caba294339b-kube-api-access-khrgg\") pod \"service-ca-operator-777779d784-cz4cq\" (UID: \"9e947476-a4bc-441e-97ab-2caba294339b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383403 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b613d2c-46d4-461a-9414-d2ce3b1788bf-config-volume\") pod \"collect-profiles-29491200-fxl4m\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383424 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wp9h\" (UniqueName: \"kubernetes.io/projected/697daa16-5938-4757-a941-83fa9dbb019b-kube-api-access-6wp9h\") pod \"package-server-manager-789f6589d5-49cqc\" (UID: \"697daa16-5938-4757-a941-83fa9dbb019b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383451 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-apiservice-cert\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383483 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjzkw\" (UniqueName: \"kubernetes.io/projected/16b7780a-d64f-4a63-b462-c924d7c44aac-kube-api-access-tjzkw\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383983 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-socket-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.384549 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e947476-a4bc-441e-97ab-2caba294339b-config\") pod \"service-ca-operator-777779d784-cz4cq\" (UID: \"9e947476-a4bc-441e-97ab-2caba294339b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.384697 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-registration-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.384797 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/337a9389-b527-41a3-aba6-fada15fb648c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7qfw\" (UID: \"337a9389-b527-41a3-aba6-fada15fb648c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.385324 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f6821801-4d19-46b9-8a38-e55810cb2dbf-signing-cabundle\") pod \"service-ca-9c57cc56f-8kqjp\" (UID: \"f6821801-4d19-46b9-8a38-e55810cb2dbf\") " pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.386825 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-plugins-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.387265 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vxtrq\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:28 crc kubenswrapper[4774]: E0127 00:09:28.387638 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:28.887620551 +0000 UTC m=+147.193397435 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.388233 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b613d2c-46d4-461a-9414-d2ce3b1788bf-config-volume\") pod \"collect-profiles-29491200-fxl4m\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.389178 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-csi-data-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.389504 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-mountpoint-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.390102 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31004ea4-e1fa-489e-a44e-701f370c9899-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dhlhj\" (UID: \"31004ea4-e1fa-489e-a44e-701f370c9899\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.390604 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/995e8bee-a3aa-466d-8007-4d12eab0d045-config-volume\") pod \"dns-default-th968\" (UID: \"995e8bee-a3aa-466d-8007-4d12eab0d045\") " pod="openshift-dns/dns-default-th968" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.395187 4774 generic.go:334] "Generic (PLEG): container 
finished" podID="86bc63c8-36a5-4c7a-bf5d-c8f93744a251" containerID="2769c007da48b10c76f3468831c07b445783b7340d1320618fcd224dd716268a" exitCode=0 Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.403253 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bfd245d-eb59-40d5-b1cf-517beaa46f32-srv-cert\") pod \"catalog-operator-68c6474976-bgm7n\" (UID: \"7bfd245d-eb59-40d5-b1cf-517beaa46f32\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.405775 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/39fc27d8-358f-40a9-8128-35b2e8f96f5c-cert\") pod \"ingress-canary-ttgpc\" (UID: \"39fc27d8-358f-40a9-8128-35b2e8f96f5c\") " pod="openshift-ingress-canary/ingress-canary-ttgpc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.424357 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e947476-a4bc-441e-97ab-2caba294339b-serving-cert\") pod \"service-ca-operator-777779d784-cz4cq\" (UID: \"9e947476-a4bc-441e-97ab-2caba294339b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.424462 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b613d2c-46d4-461a-9414-d2ce3b1788bf-secret-volume\") pod \"collect-profiles-29491200-fxl4m\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.425019 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/75b25a91-05cf-44c9-b656-c8a07515d84f-srv-cert\") pod \"olm-operator-6b444d44fb-4txdd\" (UID: \"75b25a91-05cf-44c9-b656-c8a07515d84f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.424926 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/47e6217b-24f0-495d-8c8e-cb684083e8dd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8z5sz\" (UID: \"47e6217b-24f0-495d-8c8e-cb684083e8dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.425440 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f6821801-4d19-46b9-8a38-e55810cb2dbf-signing-key\") pod \"service-ca-9c57cc56f-8kqjp\" (UID: \"f6821801-4d19-46b9-8a38-e55810cb2dbf\") " pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.425489 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31004ea4-e1fa-489e-a44e-701f370c9899-proxy-tls\") pod \"machine-config-controller-84d6567774-dhlhj\" (UID: \"31004ea4-e1fa-489e-a44e-701f370c9899\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.425500 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/337a9389-b527-41a3-aba6-fada15fb648c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7qfw\" (UID: \"337a9389-b527-41a3-aba6-fada15fb648c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.425659 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/995e8bee-a3aa-466d-8007-4d12eab0d045-metrics-tls\") pod \"dns-default-th968\" (UID: \"995e8bee-a3aa-466d-8007-4d12eab0d045\") " pod="openshift-dns/dns-default-th968" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.425723 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/30b3f92f-3bf5-448c-9542-6217fb51f239-certs\") pod \"machine-config-server-pjtrg\" (UID: \"30b3f92f-3bf5-448c-9542-6217fb51f239\") " pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.433057 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/30b3f92f-3bf5-448c-9542-6217fb51f239-node-bootstrap-token\") pod \"machine-config-server-pjtrg\" (UID: \"30b3f92f-3bf5-448c-9542-6217fb51f239\") " pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.434852 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/697daa16-5938-4757-a941-83fa9dbb019b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-49cqc\" (UID: \"697daa16-5938-4757-a941-83fa9dbb019b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.436311 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-apiservice-cert\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.437161 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vxtrq\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.439999 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-webhook-cert\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.448333 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/75b25a91-05cf-44c9-b656-c8a07515d84f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4txdd\" (UID: \"75b25a91-05cf-44c9-b656-c8a07515d84f\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.448904 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bfd245d-eb59-40d5-b1cf-517beaa46f32-profile-collector-cert\") pod \"catalog-operator-68c6474976-bgm7n\" (UID: \"7bfd245d-eb59-40d5-b1cf-517beaa46f32\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.467407 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjzkw\" (UniqueName: \"kubernetes.io/projected/16b7780a-d64f-4a63-b462-c924d7c44aac-kube-api-access-tjzkw\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.470778 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.476872 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgg6q\" (UniqueName: \"kubernetes.io/projected/337a9389-b527-41a3-aba6-fada15fb648c-kube-api-access-fgg6q\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7qfw\" (UID: \"337a9389-b527-41a3-aba6-fada15fb648c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.482163 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp85p\" (UniqueName: \"kubernetes.io/projected/0b613d2c-46d4-461a-9414-d2ce3b1788bf-kube-api-access-rp85p\") pod \"collect-profiles-29491200-fxl4m\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.484558 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:28 crc kubenswrapper[4774]: E0127 00:09:28.485616 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:28.985597105 +0000 UTC m=+147.291373989 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.494664 4774 generic.go:334] "Generic (PLEG): container finished" podID="98696d49-1f48-4d0d-9e26-691558f704c5" containerID="461f3c798ef2ec6334ea60d55994e80867f8d705a04bf8eecd05f5e88be89c4e" exitCode=0 Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.502012 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxxff\" (UniqueName: \"kubernetes.io/projected/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-kube-api-access-gxxff\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.507226 4774 patch_prober.go:28] interesting pod/console-operator-58897d9998-r9m95 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.507318 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-r9m95" podUID="956bdf0f-108e-4966-bbb6-77be23255cec" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517259 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" event={"ID":"9166b710-4bb0-4fc0-8e54-45907543c22f","Type":"ContainerStarted","Data":"475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517315 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" event={"ID":"ffb07ef8-54f3-448a-90ac-3da3e0d686ae","Type":"ContainerStarted","Data":"a28eab8188e8572bd395124f34aefe010e6daef7c7d23fae32440486658bfe60"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517327 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" event={"ID":"86bc63c8-36a5-4c7a-bf5d-c8f93744a251","Type":"ContainerDied","Data":"2769c007da48b10c76f3468831c07b445783b7340d1320618fcd224dd716268a"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517346 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517379 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517391 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 
00:09:28.517401 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" event={"ID":"1df48642-0cb9-4242-ae92-16b239115170","Type":"ContainerStarted","Data":"1de92e5e9565cd6e3aa14a05222aa73a7ad801168301409f5941e309948a72d5"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517416 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" event={"ID":"1df48642-0cb9-4242-ae92-16b239115170","Type":"ContainerStarted","Data":"96683b71050d117ee8ad13ce0ae940496549a206d325d504ab0af76a45f881f7"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517429 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517439 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29491200-4blsg" event={"ID":"518a161e-aeab-4ad6-a2c0-dee7ec963958","Type":"ContainerStarted","Data":"b36612e805e28ae8902021d1d0300be7876973ace2dc30015657a43e1a1a97ef"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517450 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" event={"ID":"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff","Type":"ContainerStarted","Data":"6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517496 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517510 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" event={"ID":"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff","Type":"ContainerStarted","Data":"63fe7051984d9e95bbe56ab4890d8824695c38cf0105a3b96dcf3496f186c863"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517525 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-r9m95" event={"ID":"956bdf0f-108e-4966-bbb6-77be23255cec","Type":"ContainerStarted","Data":"756673614d2902e6a381b31700f9a1ac56ab898d15ecff5022830f75378c833b"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517536 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-r9m95" event={"ID":"956bdf0f-108e-4966-bbb6-77be23255cec","Type":"ContainerStarted","Data":"fb91cd0fb7c4d8b171636ad72da91302e616bee195a11f3c8f1a5c8c1902137e"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517547 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" event={"ID":"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d","Type":"ContainerStarted","Data":"6650f4000967cdc6ada4b19f88ec6b9715afcbdced999fcdd492d510fe5cf68b"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517559 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" event={"ID":"98696d49-1f48-4d0d-9e26-691558f704c5","Type":"ContainerDied","Data":"461f3c798ef2ec6334ea60d55994e80867f8d705a04bf8eecd05f5e88be89c4e"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517574 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" event={"ID":"db8adc8a-118c-485d-b752-c3d0a2888458","Type":"ContainerStarted","Data":"f5ca3c51b73b4beb661b006f96edfb94984f0c9745405482264fade592c171ab"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517585 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" event={"ID":"db8adc8a-118c-485d-b752-c3d0a2888458","Type":"ContainerStarted","Data":"83f2acf12429f975ed1f532b946811b0fc7dee767180231b6f6f582cce037c72"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517595 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" event={"ID":"db8adc8a-118c-485d-b752-c3d0a2888458","Type":"ContainerStarted","Data":"232abda2c862707711617c9c81505a0449f93367cd5eeae1187998aebd758d6a"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.519555 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9pnb\" (UniqueName: \"kubernetes.io/projected/47e6217b-24f0-495d-8c8e-cb684083e8dd-kube-api-access-p9pnb\") pod \"multus-admission-controller-857f4d67dd-8z5sz\" (UID: \"47e6217b-24f0-495d-8c8e-cb684083e8dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.524682 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" event={"ID":"7cab68e6-3113-4536-b5c3-2293265c9502","Type":"ContainerStarted","Data":"388df883c464ffdffeef6f390a80ef8d644a2dddb924b82d9d055aea32f91719"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.524750 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" event={"ID":"7cab68e6-3113-4536-b5c3-2293265c9502","Type":"ContainerStarted","Data":"4310ad87f99a35e18e06c9727f3648526226a191c0798ed5c5189f02060bf3c8"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.542544 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2fq5\" (UniqueName: \"kubernetes.io/projected/39fc27d8-358f-40a9-8128-35b2e8f96f5c-kube-api-access-t2fq5\") pod \"ingress-canary-ttgpc\" (UID: \"39fc27d8-358f-40a9-8128-35b2e8f96f5c\") " pod="openshift-ingress-canary/ingress-canary-ttgpc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.548909 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnnw2\" (UniqueName: \"kubernetes.io/projected/4718399b-22db-4443-bc52-22e461891f11-kube-api-access-mnnw2\") pod \"migrator-59844c95c7-97wdt\" (UID: \"4718399b-22db-4443-bc52-22e461891f11\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.562485 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.580061 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.588605 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: E0127 00:09:28.589570 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:29.089543981 +0000 UTC m=+147.395321005 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.603932 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtcp4\" (UniqueName: \"kubernetes.io/projected/f6821801-4d19-46b9-8a38-e55810cb2dbf-kube-api-access-dtcp4\") pod \"service-ca-9c57cc56f-8kqjp\" (UID: \"f6821801-4d19-46b9-8a38-e55810cb2dbf\") " pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.614591 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vcnl\" (UniqueName: \"kubernetes.io/projected/75b25a91-05cf-44c9-b656-c8a07515d84f-kube-api-access-5vcnl\") pod \"olm-operator-6b444d44fb-4txdd\" (UID: \"75b25a91-05cf-44c9-b656-c8a07515d84f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.615286 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.629146 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.634716 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psmsg\" (UniqueName: \"kubernetes.io/projected/7bfd245d-eb59-40d5-b1cf-517beaa46f32-kube-api-access-psmsg\") pod \"catalog-operator-68c6474976-bgm7n\" (UID: \"7bfd245d-eb59-40d5-b1cf-517beaa46f32\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.638018 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfhft\" (UniqueName: \"kubernetes.io/projected/30b3f92f-3bf5-448c-9542-6217fb51f239-kube-api-access-sfhft\") pod \"machine-config-server-pjtrg\" (UID: \"30b3f92f-3bf5-448c-9542-6217fb51f239\") " pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.647106 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.652072 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bkh6\" (UniqueName: \"kubernetes.io/projected/31004ea4-e1fa-489e-a44e-701f370c9899-kube-api-access-4bkh6\") pod \"machine-config-controller-84d6567774-dhlhj\" (UID: \"31004ea4-e1fa-489e-a44e-701f370c9899\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.665292 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.668743 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh2lr\" (UniqueName: \"kubernetes.io/projected/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-kube-api-access-xh2lr\") pod \"marketplace-operator-79b997595-vxtrq\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.692343 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:28 crc kubenswrapper[4774]: E0127 00:09:28.692952 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:29.19292755 +0000 UTC m=+147.498704434 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.696066 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c89zl\" (UniqueName: \"kubernetes.io/projected/995e8bee-a3aa-466d-8007-4d12eab0d045-kube-api-access-c89zl\") pod \"dns-default-th968\" (UID: \"995e8bee-a3aa-466d-8007-4d12eab0d045\") " pod="openshift-dns/dns-default-th968" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.717948 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.719528 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wp9h\" (UniqueName: \"kubernetes.io/projected/697daa16-5938-4757-a941-83fa9dbb019b-kube-api-access-6wp9h\") pod \"package-server-manager-789f6589d5-49cqc\" (UID: \"697daa16-5938-4757-a941-83fa9dbb019b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.733427 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.746682 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khrgg\" (UniqueName: \"kubernetes.io/projected/9e947476-a4bc-441e-97ab-2caba294339b-kube-api-access-khrgg\") pod \"service-ca-operator-777779d784-cz4cq\" (UID: \"9e947476-a4bc-441e-97ab-2caba294339b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.758299 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ttgpc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.781841 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.782189 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-th968" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.800076 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.800903 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:28 crc kubenswrapper[4774]: E0127 00:09:28.809026 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:29.309011758 +0000 UTC m=+147.614788642 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.860156 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.881151 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.894264 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.905029 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.906848 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:28 crc kubenswrapper[4774]: E0127 00:09:28.909921 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:29.409902871 +0000 UTC m=+147.715679755 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.937078 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7"] Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.939270 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" podStartSLOduration=126.939253157 podStartE2EDuration="2m6.939253157s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:28.93440256 +0000 UTC m=+147.240179444" watchObservedRunningTime="2026-01-27 00:09:28.939253157 +0000 UTC m=+147.245030041" Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.012535 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.012828 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:29.512816946 +0000 UTC m=+147.818593830 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.113972 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.114935 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:29.614911265 +0000 UTC m=+147.920688159 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.199888 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.216477 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.216906 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:29.716891931 +0000 UTC m=+148.022668815 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.230035 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nt6rp"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.233376 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.317790 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-brc4j"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.320235 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.320807 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:29.820762196 +0000 UTC m=+148.126539080 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.321044 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.321968 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:29.821955611 +0000 UTC m=+148.127732495 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.340345 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" podStartSLOduration=127.340323763 podStartE2EDuration="2m7.340323763s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:29.339612911 +0000 UTC m=+147.645389805" watchObservedRunningTime="2026-01-27 00:09:29.340323763 +0000 UTC m=+147.646100657" Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.371246 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.373281 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.423145 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.423823 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:29.923784793 +0000 UTC m=+148.229561677 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.429834 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-r9m95" podStartSLOduration=127.429812457 podStartE2EDuration="2m7.429812457s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:29.429534929 +0000 UTC m=+147.735311813" watchObservedRunningTime="2026-01-27 00:09:29.429812457 +0000 UTC m=+147.735589351" Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.465628 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.496724 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-776sg"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.526089 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.526479 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.02646238 +0000 UTC m=+148.332239254 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.591917 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" event={"ID":"86bc63c8-36a5-4c7a-bf5d-c8f93744a251","Type":"ContainerStarted","Data":"644c4861c08ce86fe753e4bd8cd17c371bbebaa8fe5ede236b2b6fcdff137bce"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.594281 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.598344 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" event={"ID":"05b20433-569d-4f1d-acd8-127119a934e1","Type":"ContainerStarted","Data":"e6d3f7de559282c231a4c916b4f48e91e390980192259b8dbf4a4779fe3268e6"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.627712 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.628098 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.128081836 +0000 UTC m=+148.433858710 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.628437 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" event={"ID":"98696d49-1f48-4d0d-9e26-691558f704c5","Type":"ContainerStarted","Data":"0a57d0fb8a5602d47a075a2fd2b2312032aedb31af42a2e20363da20acb2c7e5"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.635270 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" event={"ID":"7e83c76a-d44d-4a1b-b904-857dad56b5ba","Type":"ContainerStarted","Data":"eee927cec084ec215de7c85deb5cb412ce92a14eed54c1852a7c707bfd2a57b6"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.650006 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" event={"ID":"ec68f29b-2b30-4dff-b875-7617466be51b","Type":"ContainerStarted","Data":"9829d5d366c15f8a211861ade046b712e6de98c79ae44758cbcfcef44c84b8d1"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.650071 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" event={"ID":"ec68f29b-2b30-4dff-b875-7617466be51b","Type":"ContainerStarted","Data":"094c34af32cd5eaea1f5457aa5e6dd3fb684dafbcd4d452b53af81c3814744b1"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.654352 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" event={"ID":"ffb07ef8-54f3-448a-90ac-3da3e0d686ae","Type":"ContainerStarted","Data":"26cb23b34a6f56166c5bf2249e54d83f725ea2e7cf7b0f1467fbf6422bae9b61"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.657896 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" event={"ID":"8c9ce6bb-224d-498c-9299-762d84b0eaa3","Type":"ContainerStarted","Data":"a42df0ff4fd1f9cfef43be24bbd8168ac2a4b616d00beadad8b9eb3134007fb5"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.658893 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" event={"ID":"f8bb8619-eac4-481e-bdb8-5fb5985c1844","Type":"ContainerStarted","Data":"6c6a9cbe32487a19776f84a95c618edb7f8a9a884ffdc154828b802d8fe8a819"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.666603 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nt6rp" event={"ID":"475a4aef-33e7-40ad-b7c4-20a10efa4ec3","Type":"ContainerStarted","Data":"c3306b03c14de85dd13d3daee11acc3abe88542bfecef256ec380233752ab420"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.675072 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29491200-4blsg" event={"ID":"518a161e-aeab-4ad6-a2c0-dee7ec963958","Type":"ContainerStarted","Data":"0291aad17e7875c8c8bae7dffadc2698234bc0f4a826218ffa06fe13e248194a"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.680696 4774 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-m59xl"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.689439 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" event={"ID":"e6f70216-3d55-4dd8-81f7-f2129a277407","Type":"ContainerStarted","Data":"428513d6e1f3ea985e70ec211b62da2cc65ea52ed5f48867223a9603cf158e70"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.711430 4774 csr.go:261] certificate signing request csr-26lcx is approved, waiting to be issued Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.725100 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" event={"ID":"459b67d7-cbf3-4890-a9a2-3e143ee459aa","Type":"ContainerStarted","Data":"009aae102f3825eab8a6d060ac3aeb4bbb12d9d6f4d21d29c5d213945d39be76"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.725153 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" event={"ID":"459b67d7-cbf3-4890-a9a2-3e143ee459aa","Type":"ContainerStarted","Data":"d217ac2925ef3ec95a52417166140dac43c7082f20dbdab3f1b0abf42a7a717e"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.725945 4774 csr.go:257] certificate signing request csr-26lcx is issued Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.730650 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-44dpr" event={"ID":"607f973b-fc78-4c11-bc09-fdbbe414d8c7","Type":"ContainerStarted","Data":"31c87a87bbb038977e219e5717d5635ed2ae9ac45791250d6a6f3fd630ae34ff"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.730700 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-44dpr" event={"ID":"607f973b-fc78-4c11-bc09-fdbbe414d8c7","Type":"ContainerStarted","Data":"9d72c9f2751184146cfe5ecec6413cd81a71ffdab0d5232d298c577bcd7c0051"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.739235 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.742166 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" podStartSLOduration=127.742144732 podStartE2EDuration="2m7.742144732s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:29.707203574 +0000 UTC m=+148.012980458" watchObservedRunningTime="2026-01-27 00:09:29.742144732 +0000 UTC m=+148.047921616" Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.743343 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.243320377 +0000 UTC m=+148.549097261 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.794973 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.842477 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.842715 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.342675773 +0000 UTC m=+148.648452657 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.843325 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.845725 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.345705915 +0000 UTC m=+148.651482799 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.888087 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" podStartSLOduration=127.88805557 podStartE2EDuration="2m7.88805557s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:29.821249808 +0000 UTC m=+148.127026692" watchObservedRunningTime="2026-01-27 00:09:29.88805557 +0000 UTC m=+148.193832454" Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.918125 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.952016 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.952684 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.452665374 +0000 UTC m=+148.758442258 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.053984 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:30 crc kubenswrapper[4774]: E0127 00:09:30.054456 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.554436834 +0000 UTC m=+148.860213738 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.065357 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" podStartSLOduration=128.065334006 podStartE2EDuration="2m8.065334006s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:30.030138971 +0000 UTC m=+148.335915855" watchObservedRunningTime="2026-01-27 00:09:30.065334006 +0000 UTC m=+148.371110890" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.158877 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:30 crc kubenswrapper[4774]: E0127 00:09:30.159301 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.659279577 +0000 UTC m=+148.965056461 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:30 crc kubenswrapper[4774]: W0127 00:09:30.188125 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ea5ac78_6111_4f12_a2db_3a6f4e400d39.slice/crio-7fb06d0c305e1fd99059b213ae5d901b5e971d2d6ca86c96f64c7544a2b440aa WatchSource:0}: Error finding container 7fb06d0c305e1fd99059b213ae5d901b5e971d2d6ca86c96f64c7544a2b440aa: Status 404 returned error can't find the container with id 7fb06d0c305e1fd99059b213ae5d901b5e971d2d6ca86c96f64c7544a2b440aa Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.218404 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.260275 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.260327 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.260396 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:30 crc kubenswrapper[4774]: E0127 00:09:30.262509 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.762486141 +0000 UTC m=+149.068263035 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.263989 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.290917 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.307082 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:30 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:30 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:30 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.307151 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.361718 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.362111 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.362172 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:30 crc kubenswrapper[4774]: E0127 00:09:30.365114 4774 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.865063635 +0000 UTC m=+149.170840519 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.406218 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.429775 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.437617 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.465221 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:30 crc kubenswrapper[4774]: E0127 00:09:30.465623 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.965608848 +0000 UTC m=+149.271385732 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.490732 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" podStartSLOduration=128.490708555 podStartE2EDuration="2m8.490708555s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:30.464148873 +0000 UTC m=+148.769925767" watchObservedRunningTime="2026-01-27 00:09:30.490708555 +0000 UTC m=+148.796485439" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.568182 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:30 crc kubenswrapper[4774]: E0127 00:09:30.568747 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:31.068719589 +0000 UTC m=+149.374496473 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.584510 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" podStartSLOduration=128.58449293 podStartE2EDuration="2m8.58449293s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:30.582396467 +0000 UTC m=+148.888173351" watchObservedRunningTime="2026-01-27 00:09:30.58449293 +0000 UTC m=+148.890269814" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.627128 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw"] Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.672700 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:30 crc kubenswrapper[4774]: E0127 00:09:30.673075 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:31.173062287 +0000 UTC m=+149.478839171 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.685257 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.709759 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-44dpr" podStartSLOduration=128.709739918 podStartE2EDuration="2m8.709739918s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:30.672418307 +0000 UTC m=+148.978195191" watchObservedRunningTime="2026-01-27 00:09:30.709739918 +0000 UTC m=+149.015516802" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.725220 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.730371 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-27 00:04:29 +0000 UTC, rotation deadline is 2026-11-04 01:54:12.151842091 +0000 UTC Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.730411 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6745h44m41.421433422s for next certificate rotation Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.750400 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" podStartSLOduration=128.75037835 podStartE2EDuration="2m8.75037835s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:30.74611992 +0000 UTC m=+149.051896804" watchObservedRunningTime="2026-01-27 00:09:30.75037835 +0000 UTC m=+149.056155234" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.758721 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" podStartSLOduration=128.758689694 podStartE2EDuration="2m8.758689694s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:30.710394108 +0000 UTC m=+149.016170992" watchObservedRunningTime="2026-01-27 00:09:30.758689694 +0000 UTC m=+149.064466588" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.773457 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:30 crc kubenswrapper[4774]: E0127 00:09:30.775172 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:31.275151536 +0000 UTC m=+149.580928420 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.847932 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" event={"ID":"9ea5ac78-6111-4f12-a2db-3a6f4e400d39","Type":"ContainerStarted","Data":"7fb06d0c305e1fd99059b213ae5d901b5e971d2d6ca86c96f64c7544a2b440aa"} Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.866751 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" podStartSLOduration=128.866728855 podStartE2EDuration="2m8.866728855s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:30.8642864 +0000 UTC m=+149.170063284" watchObservedRunningTime="2026-01-27 00:09:30.866728855 +0000 UTC m=+149.172505739" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.867178 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29491200-4blsg" podStartSLOduration=128.867172268 podStartE2EDuration="2m8.867172268s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:30.8083056 +0000 UTC m=+149.114082514" watchObservedRunningTime="2026-01-27 00:09:30.867172268 +0000 UTC m=+149.172949152" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.899515 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-776sg" event={"ID":"193a08ea-86ea-4176-898e-6a2c476ea6e9","Type":"ContainerStarted","Data":"76c579a20c933610cbedd0ccf1a09d14df4cfd53b16f63aad173c5564069f9ba"} Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.900998 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:30 crc kubenswrapper[4774]: E0127 00:09:30.901335 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:31.401325191 +0000 UTC m=+149.707102075 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.913214 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pjtrg" event={"ID":"30b3f92f-3bf5-448c-9542-6217fb51f239","Type":"ContainerStarted","Data":"a9c15889495054b6278cabf6b04a47f7ab566759beeb475114b78141e6a616fd"} Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.915521 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" event={"ID":"35026702-de7c-4f5f-8714-f1d7f89adae6","Type":"ContainerStarted","Data":"fcfeb7938157beba9e8fd3ea370ef1873a89ecd03a28947a3e88448549036f32"} Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.933742 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" event={"ID":"8c9ce6bb-224d-498c-9299-762d84b0eaa3","Type":"ContainerStarted","Data":"28b5da5a446fe8b0be06271f31998faa4f7613cd88185a5550982f6d5631f536"} Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.940435 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8z5sz"] Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.952586 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" event={"ID":"dcd12b52-828d-4244-99e9-2291f3a0bbb6","Type":"ContainerStarted","Data":"071506ac8e112236d78c54daccebafd4d82846318aa8cd3fac2efdaffc03e7fc"} Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.952635 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-m59xl" event={"ID":"16b7780a-d64f-4a63-b462-c924d7c44aac","Type":"ContainerStarted","Data":"ef636c2e64d05f81ed5455b2e615257d8560a2f754d13edaafaea28c35ce7218"} Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.952648 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" event={"ID":"5be86044-916e-44ba-9855-360b0ae2471a","Type":"ContainerStarted","Data":"0e425d6385bd827f0a6fa6c5fa9778a63a3d93117d41e2eb9dd4b0493eda1941"} Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.982077 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" podStartSLOduration=128.982045238 podStartE2EDuration="2m8.982045238s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:30.978086637 +0000 UTC m=+149.283863521" watchObservedRunningTime="2026-01-27 00:09:30.982045238 +0000 UTC m=+149.287822142" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.989948 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.001831 
4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.002310 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:31.502293108 +0000 UTC m=+149.808069992 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.105887 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.108312 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:31.608295226 +0000 UTC m=+149.914072110 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.207134 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.208593 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:31.70857169 +0000 UTC m=+150.014348574 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.208698 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.209328 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:31.709318463 +0000 UTC m=+150.015095337 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.245903 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:31 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:31 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:31 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.253518 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.310839 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.311535 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:31.811517646 +0000 UTC m=+150.117294520 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.368077 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" podStartSLOduration=129.368054134 podStartE2EDuration="2m9.368054134s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:31.001282216 +0000 UTC m=+149.307059100" watchObservedRunningTime="2026-01-27 00:09:31.368054134 +0000 UTC m=+149.673831018" Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.372943 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.413783 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.414124 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:31.914110161 +0000 UTC m=+150.219887055 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: W0127 00:09:31.432933 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4718399b_22db_4443_bc52_22e461891f11.slice/crio-d6ba34ea1aa05c64666fc87c9dcb27303fa7a6c055d2ffd300b16be709449b57 WatchSource:0}: Error finding container d6ba34ea1aa05c64666fc87c9dcb27303fa7a6c055d2ffd300b16be709449b57: Status 404 returned error can't find the container with id d6ba34ea1aa05c64666fc87c9dcb27303fa7a6c055d2ffd300b16be709449b57 Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.517711 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.519030 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.019013007 +0000 UTC m=+150.324789891 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.545480 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ttgpc"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.567192 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxtrq"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.587264 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.601497 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.620148 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.620870 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.120843488 +0000 UTC m=+150.426620372 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: W0127 00:09:31.648071 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e947476_a4bc_441e_97ab_2caba294339b.slice/crio-b859356ece73a046879bd4a66087cc3d8c676f3589c47e257ad8b7771aa7b303 WatchSource:0}: Error finding container b859356ece73a046879bd4a66087cc3d8c676f3589c47e257ad8b7771aa7b303: Status 404 returned error can't find the container with id b859356ece73a046879bd4a66087cc3d8c676f3589c47e257ad8b7771aa7b303 Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.723260 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.723565 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.223531696 +0000 UTC m=+150.529308620 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.762845 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.764690 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.786345 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.797456 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.811901 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.825946 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.826292 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.326278015 +0000 UTC m=+150.632054899 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.833677 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.856152 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-th968"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.901442 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8kqjp"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.926844 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.927134 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.427110536 +0000 UTC m=+150.732887420 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.927524 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.927889 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.42787397 +0000 UTC m=+150.733650854 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.932850 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.983279 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-th968" event={"ID":"995e8bee-a3aa-466d-8007-4d12eab0d045","Type":"ContainerStarted","Data":"253dc5936ce6f313e58f919994fd42f5addeb0a09d3a3889e2238b4b73343f75"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.003259 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" event={"ID":"75b25a91-05cf-44c9-b656-c8a07515d84f","Type":"ContainerStarted","Data":"5667fec29144ced5f25dd25bd78d8faf881c9c538dc4f6142b5ef1049cc27559"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.003342 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" event={"ID":"75b25a91-05cf-44c9-b656-c8a07515d84f","Type":"ContainerStarted","Data":"5d53a47b6727ca457243733c5afb6f47f1e0656cee8b6ed1c9aaadb22b92c794"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.004940 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.011123 4774 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-4txdd container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.011214 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" podUID="75b25a91-05cf-44c9-b656-c8a07515d84f" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.025092 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" event={"ID":"9ea5ac78-6111-4f12-a2db-3a6f4e400d39","Type":"ContainerStarted","Data":"3f0374186addd2423546b11bdbacd76d8615c4e447b7c172f220d14c48166675"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.029544 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:32 crc kubenswrapper[4774]: E0127 00:09:32.029983 4774 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.529965169 +0000 UTC m=+150.835742053 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.034726 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" podStartSLOduration=130.034644542 podStartE2EDuration="2m10.034644542s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.03099351 +0000 UTC m=+150.336770394" watchObservedRunningTime="2026-01-27 00:09:32.034644542 +0000 UTC m=+150.340421426" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.035142 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3571959ac4a5cb62d895465a103b5a5233d7721540e7f3d218cd9aec19eac2fd"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.044514 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" event={"ID":"9e947476-a4bc-441e-97ab-2caba294339b","Type":"ContainerStarted","Data":"b859356ece73a046879bd4a66087cc3d8c676f3589c47e257ad8b7771aa7b303"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.052319 4774 generic.go:334] "Generic (PLEG): container finished" podID="e6f70216-3d55-4dd8-81f7-f2129a277407" containerID="891ae5323767ab807707955951556a8047d50847bb6a7c6c3865b62a00f99e38" exitCode=0 Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.052425 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" event={"ID":"e6f70216-3d55-4dd8-81f7-f2129a277407","Type":"ContainerDied","Data":"891ae5323767ab807707955951556a8047d50847bb6a7c6c3865b62a00f99e38"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.059548 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" podStartSLOduration=130.059505722 podStartE2EDuration="2m10.059505722s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.054222441 +0000 UTC m=+150.359999325" watchObservedRunningTime="2026-01-27 00:09:32.059505722 +0000 UTC m=+150.365282606" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.085349 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" 
event={"ID":"697daa16-5938-4757-a941-83fa9dbb019b","Type":"ContainerStarted","Data":"a927f23907f42e1cbcad0ee3147701d27155472a79bd400df20e0ba7d71a81c1"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.104403 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-m59xl" event={"ID":"16b7780a-d64f-4a63-b462-c924d7c44aac","Type":"ContainerStarted","Data":"3ddc87762ac7bc7162f2460dc71b0e9912f687657344ce7c680463a350a9d5f5"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.114950 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" event={"ID":"47e6217b-24f0-495d-8c8e-cb684083e8dd","Type":"ContainerStarted","Data":"7455dfbc1045f78c76a5a76da0693c0c226b541b9c267d62cebb81b21e93b9a1"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.132145 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:32 crc kubenswrapper[4774]: E0127 00:09:32.134293 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.634275517 +0000 UTC m=+150.940052401 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.134598 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" event={"ID":"5be86044-916e-44ba-9855-360b0ae2471a","Type":"ContainerStarted","Data":"c2257101fa8cb5a675a5ab94b7d961a6634b3f8cb8296455232bcdda9f55f1ba"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.139707 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" event={"ID":"dcd12b52-828d-4244-99e9-2291f3a0bbb6","Type":"ContainerStarted","Data":"2510ce46ee90a783abedcd5bc9344ef584d36b4177a94213a199cca876f9dff7"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.153065 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" event={"ID":"98696d49-1f48-4d0d-9e26-691558f704c5","Type":"ContainerStarted","Data":"eb26eb479e0175ce267ac71c1b6249f0846c485d772ab1577426c90f5324ce03"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.175490 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt" event={"ID":"4718399b-22db-4443-bc52-22e461891f11","Type":"ContainerStarted","Data":"d6ba34ea1aa05c64666fc87c9dcb27303fa7a6c055d2ffd300b16be709449b57"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.196365 4774 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" event={"ID":"337a9389-b527-41a3-aba6-fada15fb648c","Type":"ContainerStarted","Data":"a99eea7bcb77baeb35c5dc052554c1290e8eec8cc836d4b0fc5e5531efd66db7"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.196423 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" event={"ID":"337a9389-b527-41a3-aba6-fada15fb648c","Type":"ContainerStarted","Data":"3d7c142fabd8a7c17405f161de3a672d2f489778ee7da90531cdd489c533d11c"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.207748 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nt6rp" event={"ID":"475a4aef-33e7-40ad-b7c4-20a10efa4ec3","Type":"ContainerStarted","Data":"5901298c40847fba572d02477eba5ed986950dbd27fc2f5646d651b5af5be153"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.209221 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-nt6rp" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.218115 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-nt6rp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.218198 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nt6rp" podUID="475a4aef-33e7-40ad-b7c4-20a10efa4ec3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.224730 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:32 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:32 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:32 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.224792 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.231841 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" event={"ID":"31004ea4-e1fa-489e-a44e-701f370c9899","Type":"ContainerStarted","Data":"ba80ccf411fcdfbf653add4e741aff9d9ad99581931e0786a6e518c6f10a4d64"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.232197 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" podStartSLOduration=130.232173888 podStartE2EDuration="2m10.232173888s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.23190445 +0000 UTC m=+150.537681354" 
watchObservedRunningTime="2026-01-27 00:09:32.232173888 +0000 UTC m=+150.537950762" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.233408 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:32 crc kubenswrapper[4774]: E0127 00:09:32.234544 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.73452775 +0000 UTC m=+151.040304624 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.237517 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" podStartSLOduration=130.237504681 podStartE2EDuration="2m10.237504681s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.165688156 +0000 UTC m=+150.471465050" watchObservedRunningTime="2026-01-27 00:09:32.237504681 +0000 UTC m=+150.543281565" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.274411 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-776sg" event={"ID":"193a08ea-86ea-4176-898e-6a2c476ea6e9","Type":"ContainerStarted","Data":"bfd7ace510f6eb2c92ab1cedcac676d157af0bdab38fdfc57c7b84c3760dbd02"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.290559 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ttgpc" event={"ID":"39fc27d8-358f-40a9-8128-35b2e8f96f5c","Type":"ContainerStarted","Data":"44bd72fd9140eea0820a67809412aa6f424d95f0545367674cbaa599f66dc752"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.294709 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" event={"ID":"959cd32e-9b14-4df3-aadf-d51b5a5a3c14","Type":"ContainerStarted","Data":"1cb130ec19ed17fb84b3d80eb2d469834c4b9bdd4b7044a29315aed07866d003"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.295992 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.302431 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" event={"ID":"0b613d2c-46d4-461a-9414-d2ce3b1788bf","Type":"ContainerStarted","Data":"aa515064c5269cbfe12aef822a7cd645b59d236ad188742d6b364f88aa50ad05"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.304970 4774 patch_prober.go:28] 
interesting pod/packageserver-d55dfcdfc-4btn7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.305023 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" podUID="959cd32e-9b14-4df3-aadf-d51b5a5a3c14" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.315746 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-nt6rp" podStartSLOduration=130.315723162 podStartE2EDuration="2m10.315723162s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.275366148 +0000 UTC m=+150.581143052" watchObservedRunningTime="2026-01-27 00:09:32.315723162 +0000 UTC m=+150.621500046" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.315875 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" event={"ID":"05b20433-569d-4f1d-acd8-127119a934e1","Type":"ContainerStarted","Data":"e625750d0dcf84e9ce73986d0f5c3c2f42aee3261cf5c6b2cf7f5bc93a823f2f"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.347375 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:32 crc kubenswrapper[4774]: E0127 00:09:32.350105 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.850084991 +0000 UTC m=+151.155861875 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.355708 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"65f41b703ac2a3495a5997805adbdeb5fea07a418c84c30891544bb7306d7cb9"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.374966 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" podStartSLOduration=130.374947861 podStartE2EDuration="2m10.374947861s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.314085491 +0000 UTC m=+150.619862375" watchObservedRunningTime="2026-01-27 00:09:32.374947861 +0000 UTC m=+150.680724745" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.394788 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" event={"ID":"35026702-de7c-4f5f-8714-f1d7f89adae6","Type":"ContainerStarted","Data":"9d3b54aefefce63346f1f00ddc4f46e09cd0df1268170f92b232f34fa16f2114"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.442417 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pjtrg" event={"ID":"30b3f92f-3bf5-448c-9542-6217fb51f239","Type":"ContainerStarted","Data":"7529f2ff9ea726ce2e976d4f7124e5201c4cf044a54240f341fa33409a178046"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.456737 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ttgpc" podStartSLOduration=7.456709199 podStartE2EDuration="7.456709199s" podCreationTimestamp="2026-01-27 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.394275752 +0000 UTC m=+150.700052626" watchObservedRunningTime="2026-01-27 00:09:32.456709199 +0000 UTC m=+150.762486073" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.457151 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:32 crc kubenswrapper[4774]: E0127 00:09:32.458606 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.958564726 +0000 UTC m=+151.264341610 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.459677 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-776sg" podStartSLOduration=130.459669369 podStartE2EDuration="2m10.459669369s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.45640106 +0000 UTC m=+150.762177964" watchObservedRunningTime="2026-01-27 00:09:32.459669369 +0000 UTC m=+150.765446253" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.473874 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" event={"ID":"f8bb8619-eac4-481e-bdb8-5fb5985c1844","Type":"ContainerStarted","Data":"d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.473926 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.492496 4774 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-brc4j container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" start-of-body= Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.492538 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" podUID="f8bb8619-eac4-481e-bdb8-5fb5985c1844" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.512730 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" event={"ID":"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72","Type":"ContainerStarted","Data":"68dbadf6774272234ad63109a60f8646d78e5ffd2b9aca24a0498e0100dbeb4f"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.583589 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" event={"ID":"7e83c76a-d44d-4a1b-b904-857dad56b5ba","Type":"ContainerStarted","Data":"8c92e747083941367d7da529da515147cb53c57429877dad5243cdb4b11c8b39"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.586760 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.591598 4774 patch_prober.go:28] interesting 
pod/marketplace-operator-79b997595-vxtrq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.591658 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" podUID="27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.594202 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:32 crc kubenswrapper[4774]: E0127 00:09:32.595931 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:33.095912333 +0000 UTC m=+151.401689207 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.604513 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.624571 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" podStartSLOduration=130.624547638 podStartE2EDuration="2m10.624547638s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.509476542 +0000 UTC m=+150.815253446" watchObservedRunningTime="2026-01-27 00:09:32.624547638 +0000 UTC m=+150.930324522" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.627467 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-pjtrg" podStartSLOduration=7.627453726 podStartE2EDuration="7.627453726s" podCreationTimestamp="2026-01-27 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.623578518 +0000 UTC m=+150.929355402" watchObservedRunningTime="2026-01-27 00:09:32.627453726 +0000 UTC m=+150.933230680" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.668302 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" podStartSLOduration=130.668280684 podStartE2EDuration="2m10.668280684s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 
00:09:32.666458628 +0000 UTC m=+150.972235502" watchObservedRunningTime="2026-01-27 00:09:32.668280684 +0000 UTC m=+150.974057568" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.690048 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:32 crc kubenswrapper[4774]: E0127 00:09:32.692204 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:33.192176374 +0000 UTC m=+151.497953258 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.793731 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:32 crc kubenswrapper[4774]: E0127 00:09:32.794607 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:33.294591544 +0000 UTC m=+151.600368428 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.895338 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:32 crc kubenswrapper[4774]: E0127 00:09:32.895779 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:33.395745575 +0000 UTC m=+151.701522459 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.963431 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" podStartSLOduration=130.963409822 podStartE2EDuration="2m10.963409822s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.963115043 +0000 UTC m=+151.268891917" watchObservedRunningTime="2026-01-27 00:09:32.963409822 +0000 UTC m=+151.269186716" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.002412 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.002786 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:33.502772775 +0000 UTC m=+151.808549659 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.103649 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.104350 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:33.604333509 +0000 UTC m=+151.910110393 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.176035 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" podStartSLOduration=131.176016509 podStartE2EDuration="2m11.176016509s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:33.170046877 +0000 UTC m=+151.475823761" watchObservedRunningTime="2026-01-27 00:09:33.176016509 +0000 UTC m=+151.481793393" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.206723 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.207226 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:33.707210372 +0000 UTC m=+152.012987256 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.238480 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:33 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:33 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:33 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.238901 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.254088 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" podStartSLOduration=131.254069214 podStartE2EDuration="2m11.254069214s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:33.213227746 +0000 UTC m=+151.519004650" watchObservedRunningTime="2026-01-27 00:09:33.254069214 +0000 UTC m=+151.559846098" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.309634 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.310049 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:33.810024494 +0000 UTC m=+152.115801378 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.411154 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.411514 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:33.911502684 +0000 UTC m=+152.217279558 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.512983 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.513560 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.013534082 +0000 UTC m=+152.319310956 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.594171 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt" event={"ID":"4718399b-22db-4443-bc52-22e461891f11","Type":"ContainerStarted","Data":"90268d198bd720c510a581bccf515d612203e072791ede054c21f802099373fb"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.594230 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt" event={"ID":"4718399b-22db-4443-bc52-22e461891f11","Type":"ContainerStarted","Data":"dcb8c6d9a33792d72682deba4d220be623f5732b9d06b099a9fae485f8604fc4"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.603142 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1f5ce666c52b75c7420f9e32dbd1584b55dc2456393c63ed931418be0f3d1bf5"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.614588 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.615232 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.115201519 +0000 UTC m=+152.420978403 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.616928 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" event={"ID":"9e947476-a4bc-441e-97ab-2caba294339b","Type":"ContainerStarted","Data":"e9fa4fe7cd9c2b10ed164729f43ce502eda4c35bfb6728f59fb5a61c815aa51e"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.619312 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" event={"ID":"697daa16-5938-4757-a941-83fa9dbb019b","Type":"ContainerStarted","Data":"ed16abebccd99073522a96ad212314cb4715dd7d871bd9b74043748576d1fab7"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.619376 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" event={"ID":"697daa16-5938-4757-a941-83fa9dbb019b","Type":"ContainerStarted","Data":"60fe548956b49516d76d1c1c5e73d5fe0ecd648fd4872217d7312b565adf7b07"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.619966 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.624642 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt" podStartSLOduration=131.624623196 podStartE2EDuration="2m11.624623196s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:33.620444829 +0000 UTC m=+151.926221723" watchObservedRunningTime="2026-01-27 00:09:33.624623196 +0000 UTC m=+151.930400090" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.637543 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" event={"ID":"35026702-de7c-4f5f-8714-f1d7f89adae6","Type":"ContainerStarted","Data":"fc01f558981d138381fa89374cb543543ad3bd9dfac543431e6e37bd66eea463"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.645116 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5c1d29767d919241d56d4e20a79f76e5963a04206a9f058c1bf0a40c5fe0a422"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.647525 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ttgpc" event={"ID":"39fc27d8-358f-40a9-8128-35b2e8f96f5c","Type":"ContainerStarted","Data":"2fa6c7280f1a5a01dcabed8cc50c68ab8c2ea3aac5ac4b8ba3892907e703e744"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.655975 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" 
event={"ID":"959cd32e-9b14-4df3-aadf-d51b5a5a3c14","Type":"ContainerStarted","Data":"7f8b2006d21a8d987794dea335f80e8f035a231409be346a564596ab6d540fad"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.680121 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" podStartSLOduration=131.680099012 podStartE2EDuration="2m11.680099012s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:33.678607116 +0000 UTC m=+151.984384000" watchObservedRunningTime="2026-01-27 00:09:33.680099012 +0000 UTC m=+151.985875896" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.685269 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" event={"ID":"47e6217b-24f0-495d-8c8e-cb684083e8dd","Type":"ContainerStarted","Data":"eb004ebc60b978a3dd9c2e58364338608b0e5ef4f1dc32050853635d443330f0"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.685319 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" event={"ID":"47e6217b-24f0-495d-8c8e-cb684083e8dd","Type":"ContainerStarted","Data":"b4484f6c66dce909af934f3050c30df56cf9d2c8b41dd9d7b2888fc73b5a8781"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.715460 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.716172 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" event={"ID":"e6f70216-3d55-4dd8-81f7-f2129a277407","Type":"ContainerStarted","Data":"13cbfdfefde4b2b846017257f33cf257e031fe7b8750958529770b5666a5ed71"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.716942 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.718143 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.218122304 +0000 UTC m=+152.523899188 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.736219 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" event={"ID":"5be86044-916e-44ba-9855-360b0ae2471a","Type":"ContainerStarted","Data":"22252fd42dd59fdcbd10887e36eee230da096419894251a5af55826bf1cd5b2f"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.763190 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" podStartSLOduration=131.76317048 podStartE2EDuration="2m11.76317048s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:33.707362506 +0000 UTC m=+152.013139390" watchObservedRunningTime="2026-01-27 00:09:33.76317048 +0000 UTC m=+152.068947364" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.771732 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-th968" event={"ID":"995e8bee-a3aa-466d-8007-4d12eab0d045","Type":"ContainerStarted","Data":"6cc72cf50a01a0eeaef3c15073d8754459c6aa3753a458b5640d227cceb423c7"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.772576 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-th968" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.811630 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" event={"ID":"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72","Type":"ContainerStarted","Data":"b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.813159 4774 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vxtrq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.813197 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" podUID="27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.818629 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.822428 4774 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.322414671 +0000 UTC m=+152.628191555 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.833992 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" podStartSLOduration=131.833979284 podStartE2EDuration="2m11.833979284s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:33.764624824 +0000 UTC m=+152.070401708" watchObservedRunningTime="2026-01-27 00:09:33.833979284 +0000 UTC m=+152.139756168" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.834771 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" event={"ID":"f6821801-4d19-46b9-8a38-e55810cb2dbf","Type":"ContainerStarted","Data":"bc04a1b5d1f6a631f21b2e03f9ae6fc19d4a13350704db08c3fa6f680e42a44c"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.834806 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" event={"ID":"f6821801-4d19-46b9-8a38-e55810cb2dbf","Type":"ContainerStarted","Data":"4bd00e60402df038d82c41fef3a8f8ff6a9aa8f7ae2a65dfb5b950354d30c142"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.861210 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" event={"ID":"7bfd245d-eb59-40d5-b1cf-517beaa46f32","Type":"ContainerStarted","Data":"170df912a9c317355402d62276e4ed8b047cd56d4f5ceccef1caf073d1c7428e"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.861269 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" event={"ID":"7bfd245d-eb59-40d5-b1cf-517beaa46f32","Type":"ContainerStarted","Data":"fe62ede7836f2766358f719f369947eea925d509f34f02a16d127d5c988f4ee3"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.862391 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.867724 4774 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-bgm7n container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.867790 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" podUID="7bfd245d-eb59-40d5-b1cf-517beaa46f32" containerName="catalog-operator" probeResult="failure" output="Get 
\"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.881022 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" event={"ID":"31004ea4-e1fa-489e-a44e-701f370c9899","Type":"ContainerStarted","Data":"5dd632505274246e91254dddec44b325983f5f143303d7eda0d2ca677d02b528"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.881176 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" event={"ID":"31004ea4-e1fa-489e-a44e-701f370c9899","Type":"ContainerStarted","Data":"075a631581eace9df7d190813bded1f6bd2c5e7d46d625e9fb54d10fff672a8f"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.898838 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"5be0f25dbcdf09c743f9fb7856b6a292a71f820b8d2df02e3668d6d9308d0b35"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.899002 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2002ab4f0da77498d61b6a634d8112f92a113453f276047f008cac4e6c357042"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.899572 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.922585 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.923119 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.423086577 +0000 UTC m=+152.728863461 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.925789 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" podStartSLOduration=131.925772108 podStartE2EDuration="2m11.925772108s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:33.92350836 +0000 UTC m=+152.229285244" watchObservedRunningTime="2026-01-27 00:09:33.925772108 +0000 UTC m=+152.231548992" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.927163 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" event={"ID":"0b613d2c-46d4-461a-9414-d2ce3b1788bf","Type":"ContainerStarted","Data":"26a6e1695fd4493c96ea71a7b30ed1405b2549d6759d42ff50698cae6462af89"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.928634 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-nt6rp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.928754 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nt6rp" podUID="475a4aef-33e7-40ad-b7c4-20a10efa4ec3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.935298 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.937476 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.437459916 +0000 UTC m=+152.743236800 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.948225 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.983202 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" podStartSLOduration=131.983181814 podStartE2EDuration="2m11.983181814s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:33.959368935 +0000 UTC m=+152.265145819" watchObservedRunningTime="2026-01-27 00:09:33.983181814 +0000 UTC m=+152.288958698" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.043521 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.045522 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.545499467 +0000 UTC m=+152.851276351 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.146682 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.147120 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.647104573 +0000 UTC m=+152.952881457 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.147921 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" podStartSLOduration=132.147909107 podStartE2EDuration="2m12.147909107s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:34.077266148 +0000 UTC m=+152.383043032" watchObservedRunningTime="2026-01-27 00:09:34.147909107 +0000 UTC m=+152.453685981" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.152289 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.154395 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" podStartSLOduration=132.154382075 podStartE2EDuration="2m12.154382075s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:34.148305718 +0000 UTC m=+152.454082622" watchObservedRunningTime="2026-01-27 00:09:34.154382075 +0000 UTC m=+152.460158959" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.214881 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-th968" podStartSLOduration=9.214845322 podStartE2EDuration="9.214845322s" podCreationTimestamp="2026-01-27 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:34.213550103 +0000 UTC m=+152.519327007" watchObservedRunningTime="2026-01-27 00:09:34.214845322 +0000 UTC m=+152.520622206" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.245170 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:34 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:34 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:34 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.245666 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.247671 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.248131 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.748112739 +0000 UTC m=+153.053889623 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.283688 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" podStartSLOduration=132.283670085 podStartE2EDuration="2m12.283670085s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:34.281193029 +0000 UTC m=+152.586969913" watchObservedRunningTime="2026-01-27 00:09:34.283670085 +0000 UTC m=+152.589446969" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.321690 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" podStartSLOduration=132.321666586 podStartE2EDuration="2m12.321666586s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:34.320985256 +0000 UTC m=+152.626762150" watchObservedRunningTime="2026-01-27 00:09:34.321666586 +0000 UTC m=+152.627443470" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.350334 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.350648 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.850636831 +0000 UTC m=+153.156413715 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.384405 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" podStartSLOduration=132.384384093 podStartE2EDuration="2m12.384384093s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:34.350080185 +0000 UTC m=+152.655857069" watchObservedRunningTime="2026-01-27 00:09:34.384384093 +0000 UTC m=+152.690160977" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.451088 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.451325 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.951307428 +0000 UTC m=+153.257084312 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.451373 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.451739 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.95173123 +0000 UTC m=+153.257508114 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.553058 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.553552 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.053520161 +0000 UTC m=+153.359297055 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.654768 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.655272 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.155248859 +0000 UTC m=+153.461025943 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.658116 4774 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4btn7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.658166 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" podUID="959cd32e-9b14-4df3-aadf-d51b5a5a3c14" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.756173 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.756279 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.256262366 +0000 UTC m=+153.562039250 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.756500 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.757053 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.257025579 +0000 UTC m=+153.562802643 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.857977 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.858180 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.358138939 +0000 UTC m=+153.663915823 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.858506 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.858936 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.358917553 +0000 UTC m=+153.664694437 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.933920 4774 generic.go:334] "Generic (PLEG): container finished" podID="0b613d2c-46d4-461a-9414-d2ce3b1788bf" containerID="26a6e1695fd4493c96ea71a7b30ed1405b2549d6759d42ff50698cae6462af89" exitCode=0 Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.934017 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" event={"ID":"0b613d2c-46d4-461a-9414-d2ce3b1788bf","Type":"ContainerDied","Data":"26a6e1695fd4493c96ea71a7b30ed1405b2549d6759d42ff50698cae6462af89"} Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.937148 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-m59xl" event={"ID":"16b7780a-d64f-4a63-b462-c924d7c44aac","Type":"ContainerStarted","Data":"10e382312a24a32bbcfe09efa9541435c12ed84634265e44fa95274fe3b603f3"} Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.937178 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-m59xl" event={"ID":"16b7780a-d64f-4a63-b462-c924d7c44aac","Type":"ContainerStarted","Data":"aa29ac077703e9abe30d41513138058abefb3f449b063ad06344e09042cff927"} Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.937192 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-m59xl" event={"ID":"16b7780a-d64f-4a63-b462-c924d7c44aac","Type":"ContainerStarted","Data":"ae0dd065d36d2ce9b1b8d1854ce10a779cd352216693596a92093376597a2f6a"} Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.940065 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-th968" event={"ID":"995e8bee-a3aa-466d-8007-4d12eab0d045","Type":"ContainerStarted","Data":"0e06505747eee92ced1229f30d7d6a8cb57e1f02736feb715aa5c87b9ef545e9"} Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.946729 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-nt6rp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.946772 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nt6rp" podUID="475a4aef-33e7-40ad-b7c4-20a10efa4ec3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.946956 4774 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vxtrq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.947022 4774 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" podUID="27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.950584 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.959261 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.959628 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.459593079 +0000 UTC m=+153.765369963 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.976098 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.982049 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-m59xl" podStartSLOduration=9.982029345 podStartE2EDuration="9.982029345s" podCreationTimestamp="2026-01-27 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:34.979948531 +0000 UTC m=+153.285725415" watchObservedRunningTime="2026-01-27 00:09:34.982029345 +0000 UTC m=+153.287806219" Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.063339 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:35 crc kubenswrapper[4774]: E0127 00:09:35.066957 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.56693525 +0000 UTC m=+153.872712134 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.163610 4774 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.164657 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:35 crc kubenswrapper[4774]: E0127 00:09:35.168945 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.668917375 +0000 UTC m=+153.974694259 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.175459 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:35 crc kubenswrapper[4774]: E0127 00:09:35.176647 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.676628791 +0000 UTC m=+153.982405675 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.223341 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:35 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:35 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:35 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.223664 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.276463 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:35 crc kubenswrapper[4774]: E0127 00:09:35.277133 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.777103092 +0000 UTC m=+154.082879976 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.277294 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:35 crc kubenswrapper[4774]: E0127 00:09:35.277666 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.777651128 +0000 UTC m=+154.083428012 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.378583 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:35 crc kubenswrapper[4774]: E0127 00:09:35.378756 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.878729216 +0000 UTC m=+154.184506100 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.379089 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:35 crc kubenswrapper[4774]: E0127 00:09:35.379424 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.879411178 +0000 UTC m=+154.185188062 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.479942 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:35 crc kubenswrapper[4774]: E0127 00:09:35.480193 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.980154046 +0000 UTC m=+154.285930940 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.480667 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:35 crc kubenswrapper[4774]: E0127 00:09:35.481170 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.981148706 +0000 UTC m=+154.286925590 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.505313 4774 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-27T00:09:35.163646525Z","Handler":null,"Name":""} Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.512303 4774 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.512351 4774 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.581311 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.585305 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.682357 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.686028 4774 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.686084 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.724512 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.992219 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.222529 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lnlbr"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.224145 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.226297 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:36 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:36 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:36 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.226341 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.228399 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lnlbr"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.229665 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.289097 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pj24q"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.290618 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.293644 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.296103 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pj24q"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.369598 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.390691 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-utilities\") pod \"certified-operators-pj24q\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.390780 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bbh6\" (UniqueName: \"kubernetes.io/projected/6cad24dd-acc4-40e5-8380-e0d74be79921-kube-api-access-7bbh6\") pod \"community-operators-lnlbr\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.390816 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-catalog-content\") pod \"certified-operators-pj24q\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.390836 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-catalog-content\") pod \"community-operators-lnlbr\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.390892 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlfv9\" (UniqueName: \"kubernetes.io/projected/af002227-deb6-4a24-8ce5-051f93bc178b-kube-api-access-qlfv9\") pod \"certified-operators-pj24q\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.390916 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-utilities\") pod \"community-operators-lnlbr\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.424090 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.436912 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zgbtz"] Jan 27 00:09:36 crc kubenswrapper[4774]: W0127 00:09:36.447992 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf626623_28b8_43a3_a567_f14b1e95075a.slice/crio-8178e110af2bc09c51dee988ba34c5369cbe128f1c84c871cab3eebbd6fb92bf WatchSource:0}: Error finding container 8178e110af2bc09c51dee988ba34c5369cbe128f1c84c871cab3eebbd6fb92bf: Status 404 returned error can't find the container with id 8178e110af2bc09c51dee988ba34c5369cbe128f1c84c871cab3eebbd6fb92bf Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.492478 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-utilities\") pod \"certified-operators-pj24q\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.492553 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bbh6\" (UniqueName: \"kubernetes.io/projected/6cad24dd-acc4-40e5-8380-e0d74be79921-kube-api-access-7bbh6\") pod \"community-operators-lnlbr\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.492577 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-catalog-content\") pod \"certified-operators-pj24q\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.492595 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-catalog-content\") pod \"community-operators-lnlbr\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.492652 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlfv9\" (UniqueName: \"kubernetes.io/projected/af002227-deb6-4a24-8ce5-051f93bc178b-kube-api-access-qlfv9\") pod \"certified-operators-pj24q\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.492676 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-utilities\") pod \"community-operators-lnlbr\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.493114 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-utilities\") pod \"certified-operators-pj24q\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " 
pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.493181 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ngr5v"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.493214 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-utilities\") pod \"community-operators-lnlbr\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: E0127 00:09:36.493466 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b613d2c-46d4-461a-9414-d2ce3b1788bf" containerName="collect-profiles" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.493767 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-catalog-content\") pod \"certified-operators-pj24q\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.493890 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b613d2c-46d4-461a-9414-d2ce3b1788bf" containerName="collect-profiles" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.493977 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-catalog-content\") pod \"community-operators-lnlbr\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.494035 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b613d2c-46d4-461a-9414-d2ce3b1788bf" containerName="collect-profiles" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.494967 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.541912 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ngr5v"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.545360 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bbh6\" (UniqueName: \"kubernetes.io/projected/6cad24dd-acc4-40e5-8380-e0d74be79921-kube-api-access-7bbh6\") pod \"community-operators-lnlbr\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.548773 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.556550 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlfv9\" (UniqueName: \"kubernetes.io/projected/af002227-deb6-4a24-8ce5-051f93bc178b-kube-api-access-qlfv9\") pod \"certified-operators-pj24q\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.594256 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp85p\" (UniqueName: \"kubernetes.io/projected/0b613d2c-46d4-461a-9414-d2ce3b1788bf-kube-api-access-rp85p\") pod \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.597623 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b613d2c-46d4-461a-9414-d2ce3b1788bf-config-volume\") pod \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.597705 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b613d2c-46d4-461a-9414-d2ce3b1788bf-secret-volume\") pod \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.599369 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b613d2c-46d4-461a-9414-d2ce3b1788bf-config-volume" (OuterVolumeSpecName: "config-volume") pod "0b613d2c-46d4-461a-9414-d2ce3b1788bf" (UID: "0b613d2c-46d4-461a-9414-d2ce3b1788bf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.600326 4774 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b613d2c-46d4-461a-9414-d2ce3b1788bf-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.602664 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b613d2c-46d4-461a-9414-d2ce3b1788bf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0b613d2c-46d4-461a-9414-d2ce3b1788bf" (UID: "0b613d2c-46d4-461a-9414-d2ce3b1788bf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.603373 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b613d2c-46d4-461a-9414-d2ce3b1788bf-kube-api-access-rp85p" (OuterVolumeSpecName: "kube-api-access-rp85p") pod "0b613d2c-46d4-461a-9414-d2ce3b1788bf" (UID: "0b613d2c-46d4-461a-9414-d2ce3b1788bf"). InnerVolumeSpecName "kube-api-access-rp85p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.643319 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.675908 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.675984 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.687791 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kb8hq"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.689163 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.697076 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.697283 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.702270 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-catalog-content\") pod \"community-operators-ngr5v\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.702312 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-utilities\") pod \"certified-operators-kb8hq\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.702356 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57w5g\" (UniqueName: \"kubernetes.io/projected/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-kube-api-access-57w5g\") pod \"certified-operators-kb8hq\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.702429 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvt8p\" (UniqueName: \"kubernetes.io/projected/5e621715-39b3-4094-b589-ede8b74b0e8f-kube-api-access-nvt8p\") pod \"community-operators-ngr5v\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.702464 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-utilities\") pod \"community-operators-ngr5v\" (UID: 
\"5e621715-39b3-4094-b589-ede8b74b0e8f\") " pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.702519 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-catalog-content\") pod \"certified-operators-kb8hq\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.702593 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp85p\" (UniqueName: \"kubernetes.io/projected/0b613d2c-46d4-461a-9414-d2ce3b1788bf-kube-api-access-rp85p\") on node \"crc\" DevicePath \"\"" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.702614 4774 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b613d2c-46d4-461a-9414-d2ce3b1788bf-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.706691 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kb8hq"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.723378 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.802335 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lnlbr"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.803309 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-catalog-content\") pod \"community-operators-ngr5v\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.803358 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-utilities\") pod \"certified-operators-kb8hq\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.803399 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57w5g\" (UniqueName: \"kubernetes.io/projected/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-kube-api-access-57w5g\") pod \"certified-operators-kb8hq\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.803430 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvt8p\" (UniqueName: \"kubernetes.io/projected/5e621715-39b3-4094-b589-ede8b74b0e8f-kube-api-access-nvt8p\") pod \"community-operators-ngr5v\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.803455 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-utilities\") pod \"community-operators-ngr5v\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " 
pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.803481 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-catalog-content\") pod \"certified-operators-kb8hq\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.804498 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-catalog-content\") pod \"certified-operators-kb8hq\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.804607 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-utilities\") pod \"certified-operators-kb8hq\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.806741 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-utilities\") pod \"community-operators-ngr5v\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.807920 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-catalog-content\") pod \"community-operators-ngr5v\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.823726 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57w5g\" (UniqueName: \"kubernetes.io/projected/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-kube-api-access-57w5g\") pod \"certified-operators-kb8hq\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.824394 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvt8p\" (UniqueName: \"kubernetes.io/projected/5e621715-39b3-4094-b589-ede8b74b0e8f-kube-api-access-nvt8p\") pod \"community-operators-ngr5v\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.828538 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: W0127 00:09:36.855656 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cad24dd_acc4_40e5_8380_e0d74be79921.slice/crio-9329c497510d489baaa3885ca712f040e06f04f21f51c80edb354ab47b28b6f1 WatchSource:0}: Error finding container 9329c497510d489baaa3885ca712f040e06f04f21f51c80edb354ab47b28b6f1: Status 404 returned error can't find the container with id 9329c497510d489baaa3885ca712f040e06f04f21f51c80edb354ab47b28b6f1 Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.956266 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnlbr" event={"ID":"6cad24dd-acc4-40e5-8380-e0d74be79921","Type":"ContainerStarted","Data":"9329c497510d489baaa3885ca712f040e06f04f21f51c80edb354ab47b28b6f1"} Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.960519 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" event={"ID":"0b613d2c-46d4-461a-9414-d2ce3b1788bf","Type":"ContainerDied","Data":"aa515064c5269cbfe12aef822a7cd645b59d236ad188742d6b364f88aa50ad05"} Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.960567 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa515064c5269cbfe12aef822a7cd645b59d236ad188742d6b364f88aa50ad05" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.960734 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.965039 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pj24q"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.972035 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" event={"ID":"df626623-28b8-43a3-a567-f14b1e95075a","Type":"ContainerStarted","Data":"3d1d5a1b2275a0bf584d707ccddcf233f3105c423230ffd6922c612599ca8387"} Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.972111 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" event={"ID":"df626623-28b8-43a3-a567-f14b1e95075a","Type":"ContainerStarted","Data":"8178e110af2bc09c51dee988ba34c5369cbe128f1c84c871cab3eebbd6fb92bf"} Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.974214 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.979746 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.986802 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.987789 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.995596 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.995911 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:36.998647 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" podStartSLOduration=134.998612375 podStartE2EDuration="2m14.998612375s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:36.996232401 +0000 UTC m=+155.302009305" watchObservedRunningTime="2026-01-27 00:09:36.998612375 +0000 UTC m=+155.304389279" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.019293 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.059270 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.109752 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d69e5e66-2cd3-4102-9126-0577e4442e7b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d69e5e66-2cd3-4102-9126-0577e4442e7b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.110083 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d69e5e66-2cd3-4102-9126-0577e4442e7b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d69e5e66-2cd3-4102-9126-0577e4442e7b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.115653 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ngr5v"] Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.122768 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.211922 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d69e5e66-2cd3-4102-9126-0577e4442e7b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d69e5e66-2cd3-4102-9126-0577e4442e7b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.212429 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d69e5e66-2cd3-4102-9126-0577e4442e7b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d69e5e66-2cd3-4102-9126-0577e4442e7b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.212738 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d69e5e66-2cd3-4102-9126-0577e4442e7b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d69e5e66-2cd3-4102-9126-0577e4442e7b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.226138 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:37 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:37 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:37 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.226212 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.238113 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d69e5e66-2cd3-4102-9126-0577e4442e7b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d69e5e66-2cd3-4102-9126-0577e4442e7b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.315608 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.330339 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kb8hq"] Jan 27 00:09:37 crc kubenswrapper[4774]: W0127 00:09:37.374172 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f2c4e4_f111_453e_b953_cdf2528f9a3e.slice/crio-fee2b67cf24ce6f088a6fa76fdec56fa754242f0563e3da4ab6a39b52a665f76 WatchSource:0}: Error finding container fee2b67cf24ce6f088a6fa76fdec56fa754242f0563e3da4ab6a39b52a665f76: Status 404 returned error can't find the container with id fee2b67cf24ce6f088a6fa76fdec56fa754242f0563e3da4ab6a39b52a665f76 Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.539644 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.976811 4774 generic.go:334] "Generic (PLEG): container finished" podID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerID="063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89" exitCode=0 Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.976888 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnlbr" event={"ID":"6cad24dd-acc4-40e5-8380-e0d74be79921","Type":"ContainerDied","Data":"063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89"} Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.979113 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.979751 4774 generic.go:334] "Generic (PLEG): container finished" podID="5e621715-39b3-4094-b589-ede8b74b0e8f" 
containerID="5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202" exitCode=0 Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.979833 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngr5v" event={"ID":"5e621715-39b3-4094-b589-ede8b74b0e8f","Type":"ContainerDied","Data":"5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202"} Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.979932 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngr5v" event={"ID":"5e621715-39b3-4094-b589-ede8b74b0e8f","Type":"ContainerStarted","Data":"fafa96ca772c719431784a78d0a35959db893c145e408b85c8cbdfa6a0078917"} Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.982071 4774 generic.go:334] "Generic (PLEG): container finished" podID="af002227-deb6-4a24-8ce5-051f93bc178b" containerID="817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3" exitCode=0 Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.982163 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pj24q" event={"ID":"af002227-deb6-4a24-8ce5-051f93bc178b","Type":"ContainerDied","Data":"817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3"} Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.982197 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pj24q" event={"ID":"af002227-deb6-4a24-8ce5-051f93bc178b","Type":"ContainerStarted","Data":"f9e3168af24b8a48492cc6713c61bfd5396824e8a18842b10c09f8a37f4eed28"} Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.984187 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d69e5e66-2cd3-4102-9126-0577e4442e7b","Type":"ContainerStarted","Data":"ef257a72263bb152bc90b02e175f4869660c82a437b852f0f0c82132e335ab9f"} Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.985887 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerID="32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf" exitCode=0 Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.985975 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb8hq" event={"ID":"f4f2c4e4-f111-453e-b953-cdf2528f9a3e","Type":"ContainerDied","Data":"32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf"} Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.986017 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb8hq" event={"ID":"f4f2c4e4-f111-453e-b953-cdf2528f9a3e","Type":"ContainerStarted","Data":"fee2b67cf24ce6f088a6fa76fdec56fa754242f0563e3da4ab6a39b52a665f76"} Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.121657 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-nt6rp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.122101 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nt6rp" podUID="475a4aef-33e7-40ad-b7c4-20a10efa4ec3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: 
connection refused" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.121829 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-nt6rp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.122459 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-nt6rp" podUID="475a4aef-33e7-40ad-b7c4-20a10efa4ec3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.171293 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.171349 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.174009 4774 patch_prober.go:28] interesting pod/console-f9d7485db-776sg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.174084 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-776sg" podUID="193a08ea-86ea-4176-898e-6a2c476ea6e9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.219093 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.225063 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:38 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:38 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:38 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.225146 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.287861 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ds9fx"] Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.289075 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.291528 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.299844 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ds9fx"] Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.432354 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-catalog-content\") pod \"redhat-marketplace-ds9fx\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.432416 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcf42\" (UniqueName: \"kubernetes.io/projected/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-kube-api-access-lcf42\") pod \"redhat-marketplace-ds9fx\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.432451 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-utilities\") pod \"redhat-marketplace-ds9fx\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.533865 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-utilities\") pod \"redhat-marketplace-ds9fx\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.534011 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-catalog-content\") pod \"redhat-marketplace-ds9fx\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.534033 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcf42\" (UniqueName: \"kubernetes.io/projected/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-kube-api-access-lcf42\") pod \"redhat-marketplace-ds9fx\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.534460 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-utilities\") pod \"redhat-marketplace-ds9fx\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.534560 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-catalog-content\") pod \"redhat-marketplace-ds9fx\" (UID: 
\"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.554433 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcf42\" (UniqueName: \"kubernetes.io/projected/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-kube-api-access-lcf42\") pod \"redhat-marketplace-ds9fx\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.611468 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.687846 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tlk6f"] Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.689225 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.697699 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tlk6f"] Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.839057 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-utilities\") pod \"redhat-marketplace-tlk6f\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.839467 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-catalog-content\") pod \"redhat-marketplace-tlk6f\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.839542 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g24zl\" (UniqueName: \"kubernetes.io/projected/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-kube-api-access-g24zl\") pod \"redhat-marketplace-tlk6f\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.899946 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.918001 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ds9fx"] Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.940539 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-catalog-content\") pod \"redhat-marketplace-tlk6f\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.941110 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-catalog-content\") pod \"redhat-marketplace-tlk6f\" (UID: 
\"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.941142 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g24zl\" (UniqueName: \"kubernetes.io/projected/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-kube-api-access-g24zl\") pod \"redhat-marketplace-tlk6f\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.941219 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-utilities\") pod \"redhat-marketplace-tlk6f\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.941487 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-utilities\") pod \"redhat-marketplace-tlk6f\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.963409 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g24zl\" (UniqueName: \"kubernetes.io/projected/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-kube-api-access-g24zl\") pod \"redhat-marketplace-tlk6f\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.993531 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds9fx" event={"ID":"687ff3b2-f773-482b-9bf7-1b6135b4d6ac","Type":"ContainerStarted","Data":"7c895fbfa4a8556de4f63281aed06489f2f81ec44684ed968eb3615baa9f463f"} Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.010302 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d69e5e66-2cd3-4102-9126-0577e4442e7b","Type":"ContainerDied","Data":"5df84b7c9c635777323921eed7ce0bdc1ff6c29233389784fd92a34993ae9156"} Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.010260 4774 generic.go:334] "Generic (PLEG): container finished" podID="d69e5e66-2cd3-4102-9126-0577e4442e7b" containerID="5df84b7c9c635777323921eed7ce0bdc1ff6c29233389784fd92a34993ae9156" exitCode=0 Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.035372 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.227523 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:39 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:39 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:39 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.227951 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.289009 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-97k6k"] Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.290074 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.292996 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.297284 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-97k6k"] Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.357329 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk44x\" (UniqueName: \"kubernetes.io/projected/386f196b-c4bc-4fea-924b-c0487a352310-kube-api-access-vk44x\") pod \"redhat-operators-97k6k\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.357378 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-utilities\") pod \"redhat-operators-97k6k\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.357442 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-catalog-content\") pod \"redhat-operators-97k6k\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.459116 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-catalog-content\") pod \"redhat-operators-97k6k\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.459639 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk44x\" (UniqueName: \"kubernetes.io/projected/386f196b-c4bc-4fea-924b-c0487a352310-kube-api-access-vk44x\") pod \"redhat-operators-97k6k\" (UID: 
\"386f196b-c4bc-4fea-924b-c0487a352310\") " pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.459668 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-utilities\") pod \"redhat-operators-97k6k\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.460172 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-utilities\") pod \"redhat-operators-97k6k\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.460386 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-catalog-content\") pod \"redhat-operators-97k6k\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.481061 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk44x\" (UniqueName: \"kubernetes.io/projected/386f196b-c4bc-4fea-924b-c0487a352310-kube-api-access-vk44x\") pod \"redhat-operators-97k6k\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.558084 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tlk6f"] Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.652039 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.688109 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mch4n"] Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.689191 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.701183 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mch4n"] Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.764547 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8rvm\" (UniqueName: \"kubernetes.io/projected/6e47ec75-abd8-41d4-a3c8-e722d21a305f-kube-api-access-w8rvm\") pod \"redhat-operators-mch4n\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.764604 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-catalog-content\") pod \"redhat-operators-mch4n\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.764735 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-utilities\") pod \"redhat-operators-mch4n\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.867232 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-utilities\") pod \"redhat-operators-mch4n\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.867986 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-catalog-content\") pod \"redhat-operators-mch4n\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.868022 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8rvm\" (UniqueName: \"kubernetes.io/projected/6e47ec75-abd8-41d4-a3c8-e722d21a305f-kube-api-access-w8rvm\") pod \"redhat-operators-mch4n\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.870878 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-catalog-content\") pod \"redhat-operators-mch4n\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.871982 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-utilities\") pod \"redhat-operators-mch4n\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.883127 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-w8rvm\" (UniqueName: \"kubernetes.io/projected/6e47ec75-abd8-41d4-a3c8-e722d21a305f-kube-api-access-w8rvm\") pod \"redhat-operators-mch4n\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.020850 4774 generic.go:334] "Generic (PLEG): container finished" podID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerID="59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532" exitCode=0 Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.020973 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds9fx" event={"ID":"687ff3b2-f773-482b-9bf7-1b6135b4d6ac","Type":"ContainerDied","Data":"59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532"} Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.024599 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.025324 4774 generic.go:334] "Generic (PLEG): container finished" podID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerID="7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731" exitCode=0 Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.025571 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlk6f" event={"ID":"b5f7850c-533c-48e5-bead-ddc0e7ba8d83","Type":"ContainerDied","Data":"7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731"} Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.025601 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlk6f" event={"ID":"b5f7850c-533c-48e5-bead-ddc0e7ba8d83","Type":"ContainerStarted","Data":"ce3bedf60ecff127b9cf54aca7f49498f367bae4df2ca706885feb215a762b62"} Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.192040 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-97k6k"] Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.226333 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:40 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:40 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:40 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.226777 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.330360 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.331508 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.335814 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.337216 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.338867 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.379912 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mch4n"] Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.389851 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 00:09:40 crc kubenswrapper[4774]: W0127 00:09:40.421658 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e47ec75_abd8_41d4_a3c8_e722d21a305f.slice/crio-1bc82424f292aa8d315ed22486552e2ce20a0278e67aebe2d803ffbc4e72f07c WatchSource:0}: Error finding container 1bc82424f292aa8d315ed22486552e2ce20a0278e67aebe2d803ffbc4e72f07c: Status 404 returned error can't find the container with id 1bc82424f292aa8d315ed22486552e2ce20a0278e67aebe2d803ffbc4e72f07c Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.475882 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d69e5e66-2cd3-4102-9126-0577e4442e7b-kube-api-access\") pod \"d69e5e66-2cd3-4102-9126-0577e4442e7b\" (UID: \"d69e5e66-2cd3-4102-9126-0577e4442e7b\") " Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.475921 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d69e5e66-2cd3-4102-9126-0577e4442e7b-kubelet-dir\") pod \"d69e5e66-2cd3-4102-9126-0577e4442e7b\" (UID: \"d69e5e66-2cd3-4102-9126-0577e4442e7b\") " Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.476229 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cd05a6f8-6bd6-40f4-8a17-79fccebb7103\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.476269 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cd05a6f8-6bd6-40f4-8a17-79fccebb7103\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.477289 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d69e5e66-2cd3-4102-9126-0577e4442e7b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d69e5e66-2cd3-4102-9126-0577e4442e7b" (UID: "d69e5e66-2cd3-4102-9126-0577e4442e7b"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.482544 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d69e5e66-2cd3-4102-9126-0577e4442e7b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d69e5e66-2cd3-4102-9126-0577e4442e7b" (UID: "d69e5e66-2cd3-4102-9126-0577e4442e7b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.576958 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cd05a6f8-6bd6-40f4-8a17-79fccebb7103\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.577101 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cd05a6f8-6bd6-40f4-8a17-79fccebb7103\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.577100 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cd05a6f8-6bd6-40f4-8a17-79fccebb7103\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.577542 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d69e5e66-2cd3-4102-9126-0577e4442e7b-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.577555 4774 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d69e5e66-2cd3-4102-9126-0577e4442e7b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.597375 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cd05a6f8-6bd6-40f4-8a17-79fccebb7103\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.684723 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.061351 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d69e5e66-2cd3-4102-9126-0577e4442e7b","Type":"ContainerDied","Data":"ef257a72263bb152bc90b02e175f4869660c82a437b852f0f0c82132e335ab9f"} Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.061392 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef257a72263bb152bc90b02e175f4869660c82a437b852f0f0c82132e335ab9f" Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.061368 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.064874 4774 generic.go:334] "Generic (PLEG): container finished" podID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerID="ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d" exitCode=0 Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.064966 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mch4n" event={"ID":"6e47ec75-abd8-41d4-a3c8-e722d21a305f","Type":"ContainerDied","Data":"ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d"} Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.065000 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mch4n" event={"ID":"6e47ec75-abd8-41d4-a3c8-e722d21a305f","Type":"ContainerStarted","Data":"1bc82424f292aa8d315ed22486552e2ce20a0278e67aebe2d803ffbc4e72f07c"} Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.069061 4774 generic.go:334] "Generic (PLEG): container finished" podID="386f196b-c4bc-4fea-924b-c0487a352310" containerID="783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9" exitCode=0 Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.069117 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97k6k" event={"ID":"386f196b-c4bc-4fea-924b-c0487a352310","Type":"ContainerDied","Data":"783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9"} Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.069146 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97k6k" event={"ID":"386f196b-c4bc-4fea-924b-c0487a352310","Type":"ContainerStarted","Data":"8a4f7dc871614277fda5710d46e4863b99158a3773f029cd73a7cc991db5ffae"} Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.204665 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 27 00:09:41 crc kubenswrapper[4774]: W0127 00:09:41.217935 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcd05a6f8_6bd6_40f4_8a17_79fccebb7103.slice/crio-5e0f15f81a552e5a6f966000dc9bf9395c95f317fda446bd5fbe90cdc233f5af WatchSource:0}: Error finding container 5e0f15f81a552e5a6f966000dc9bf9395c95f317fda446bd5fbe90cdc233f5af: Status 404 returned error can't find the container with id 5e0f15f81a552e5a6f966000dc9bf9395c95f317fda446bd5fbe90cdc233f5af Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.223591 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.227686 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:42 crc kubenswrapper[4774]: I0127 00:09:42.082283 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cd05a6f8-6bd6-40f4-8a17-79fccebb7103","Type":"ContainerStarted","Data":"5e0f15f81a552e5a6f966000dc9bf9395c95f317fda446bd5fbe90cdc233f5af"} Jan 27 00:09:43 crc kubenswrapper[4774]: I0127 00:09:43.092142 4774 generic.go:334] "Generic (PLEG): container finished" podID="cd05a6f8-6bd6-40f4-8a17-79fccebb7103" containerID="2094c034871a99cde3fa9218cb44ce39e3caf7cba67f764090a2ef81441ea7ec" exitCode=0 Jan 27 00:09:43 crc 
kubenswrapper[4774]: I0127 00:09:43.092256 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cd05a6f8-6bd6-40f4-8a17-79fccebb7103","Type":"ContainerDied","Data":"2094c034871a99cde3fa9218cb44ce39e3caf7cba67f764090a2ef81441ea7ec"} Jan 27 00:09:43 crc kubenswrapper[4774]: I0127 00:09:43.785409 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-th968" Jan 27 00:09:44 crc kubenswrapper[4774]: I0127 00:09:44.319297 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 00:09:44 crc kubenswrapper[4774]: I0127 00:09:44.463110 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kubelet-dir\") pod \"cd05a6f8-6bd6-40f4-8a17-79fccebb7103\" (UID: \"cd05a6f8-6bd6-40f4-8a17-79fccebb7103\") " Jan 27 00:09:44 crc kubenswrapper[4774]: I0127 00:09:44.463208 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kube-api-access\") pod \"cd05a6f8-6bd6-40f4-8a17-79fccebb7103\" (UID: \"cd05a6f8-6bd6-40f4-8a17-79fccebb7103\") " Jan 27 00:09:44 crc kubenswrapper[4774]: I0127 00:09:44.464380 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cd05a6f8-6bd6-40f4-8a17-79fccebb7103" (UID: "cd05a6f8-6bd6-40f4-8a17-79fccebb7103"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:09:44 crc kubenswrapper[4774]: I0127 00:09:44.469049 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cd05a6f8-6bd6-40f4-8a17-79fccebb7103" (UID: "cd05a6f8-6bd6-40f4-8a17-79fccebb7103"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:09:44 crc kubenswrapper[4774]: I0127 00:09:44.564840 4774 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 00:09:44 crc kubenswrapper[4774]: I0127 00:09:44.565388 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 00:09:45 crc kubenswrapper[4774]: I0127 00:09:45.121897 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cd05a6f8-6bd6-40f4-8a17-79fccebb7103","Type":"ContainerDied","Data":"5e0f15f81a552e5a6f966000dc9bf9395c95f317fda446bd5fbe90cdc233f5af"} Jan 27 00:09:45 crc kubenswrapper[4774]: I0127 00:09:45.121954 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e0f15f81a552e5a6f966000dc9bf9395c95f317fda446bd5fbe90cdc233f5af" Jan 27 00:09:45 crc kubenswrapper[4774]: I0127 00:09:45.122039 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 00:09:46 crc kubenswrapper[4774]: I0127 00:09:46.395057 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:46 crc kubenswrapper[4774]: I0127 00:09:46.403628 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:46 crc kubenswrapper[4774]: I0127 00:09:46.683439 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:48 crc kubenswrapper[4774]: I0127 00:09:48.131081 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-nt6rp" Jan 27 00:09:48 crc kubenswrapper[4774]: I0127 00:09:48.171692 4774 patch_prober.go:28] interesting pod/console-f9d7485db-776sg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Jan 27 00:09:48 crc kubenswrapper[4774]: I0127 00:09:48.171747 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-776sg" podUID="193a08ea-86ea-4176-898e-6a2c476ea6e9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" Jan 27 00:09:56 crc kubenswrapper[4774]: I0127 00:09:56.002392 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:58 crc kubenswrapper[4774]: I0127 00:09:58.176038 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:58 crc kubenswrapper[4774]: I0127 00:09:58.179851 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:10:03 crc kubenswrapper[4774]: I0127 00:10:03.260985 4774 generic.go:334] "Generic (PLEG): container finished" podID="518a161e-aeab-4ad6-a2c0-dee7ec963958" containerID="0291aad17e7875c8c8bae7dffadc2698234bc0f4a826218ffa06fe13e248194a" exitCode=0 Jan 27 00:10:03 crc kubenswrapper[4774]: I0127 00:10:03.261051 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29491200-4blsg" event={"ID":"518a161e-aeab-4ad6-a2c0-dee7ec963958","Type":"ContainerDied","Data":"0291aad17e7875c8c8bae7dffadc2698234bc0f4a826218ffa06fe13e248194a"} Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.084108 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage120616833/1\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.084555 4774 kuberuntime_manager.go:1274] 
"Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w8rvm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-mch4n_openshift-marketplace(6e47ec75-abd8-41d4-a3c8-e722d21a305f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage120616833/1\": happened during read: context canceled" logger="UnhandledError" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.085736 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage120616833/1\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-operators-mch4n" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.116304 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.116466 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-57w5g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-kb8hq_openshift-marketplace(f4f2c4e4-f111-453e-b953-cdf2528f9a3e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.118066 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-kb8hq" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.176128 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.176725 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nvt8p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-ngr5v_openshift-marketplace(5e621715-39b3-4094-b589-ede8b74b0e8f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.177910 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-ngr5v" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.224248 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.224406 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g24zl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-tlk6f_openshift-marketplace(b5f7850c-533c-48e5-bead-ddc0e7ba8d83): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.225832 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-tlk6f" podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" Jan 27 00:10:06 crc kubenswrapper[4774]: I0127 00:10:06.675893 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:10:06 crc kubenswrapper[4774]: I0127 00:10:06.676429 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:10:08 crc kubenswrapper[4774]: E0127 00:10:08.833485 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-ngr5v" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" Jan 27 00:10:08 crc kubenswrapper[4774]: E0127 00:10:08.834322 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-kb8hq" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" Jan 27 00:10:08 crc kubenswrapper[4774]: E0127 
00:10:08.834397 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-mch4n" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" Jan 27 00:10:08 crc kubenswrapper[4774]: E0127 00:10:08.834481 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-tlk6f" podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" Jan 27 00:10:08 crc kubenswrapper[4774]: I0127 00:10:08.879197 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" Jan 27 00:10:08 crc kubenswrapper[4774]: I0127 00:10:08.932171 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29491200-4blsg" Jan 27 00:10:08 crc kubenswrapper[4774]: E0127 00:10:08.972263 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 27 00:10:08 crc kubenswrapper[4774]: E0127 00:10:08.972436 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vk44x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-97k6k_openshift-marketplace(386f196b-c4bc-4fea-924b-c0487a352310): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 00:10:08 crc kubenswrapper[4774]: E0127 00:10:08.973613 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-97k6k" podUID="386f196b-c4bc-4fea-924b-c0487a352310" Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.035036 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kbkr\" (UniqueName: \"kubernetes.io/projected/518a161e-aeab-4ad6-a2c0-dee7ec963958-kube-api-access-8kbkr\") pod \"518a161e-aeab-4ad6-a2c0-dee7ec963958\" (UID: \"518a161e-aeab-4ad6-a2c0-dee7ec963958\") " Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.035608 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/518a161e-aeab-4ad6-a2c0-dee7ec963958-serviceca\") pod \"518a161e-aeab-4ad6-a2c0-dee7ec963958\" (UID: \"518a161e-aeab-4ad6-a2c0-dee7ec963958\") " Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.036788 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/518a161e-aeab-4ad6-a2c0-dee7ec963958-serviceca" (OuterVolumeSpecName: "serviceca") pod "518a161e-aeab-4ad6-a2c0-dee7ec963958" (UID: "518a161e-aeab-4ad6-a2c0-dee7ec963958"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.046013 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/518a161e-aeab-4ad6-a2c0-dee7ec963958-kube-api-access-8kbkr" (OuterVolumeSpecName: "kube-api-access-8kbkr") pod "518a161e-aeab-4ad6-a2c0-dee7ec963958" (UID: "518a161e-aeab-4ad6-a2c0-dee7ec963958"). InnerVolumeSpecName "kube-api-access-8kbkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.137101 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kbkr\" (UniqueName: \"kubernetes.io/projected/518a161e-aeab-4ad6-a2c0-dee7ec963958-kube-api-access-8kbkr\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.137164 4774 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/518a161e-aeab-4ad6-a2c0-dee7ec963958-serviceca\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.296405 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pj24q" event={"ID":"af002227-deb6-4a24-8ce5-051f93bc178b","Type":"ContainerStarted","Data":"b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0"} Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.302505 4774 generic.go:334] "Generic (PLEG): container finished" podID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerID="b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724" exitCode=0 Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.302652 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds9fx" event={"ID":"687ff3b2-f773-482b-9bf7-1b6135b4d6ac","Type":"ContainerDied","Data":"b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724"} Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.306820 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29491200-4blsg" 
event={"ID":"518a161e-aeab-4ad6-a2c0-dee7ec963958","Type":"ContainerDied","Data":"b36612e805e28ae8902021d1d0300be7876973ace2dc30015657a43e1a1a97ef"} Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.306868 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b36612e805e28ae8902021d1d0300be7876973ace2dc30015657a43e1a1a97ef" Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.306932 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29491200-4blsg" Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.318363 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnlbr" event={"ID":"6cad24dd-acc4-40e5-8380-e0d74be79921","Type":"ContainerStarted","Data":"20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f"} Jan 27 00:10:09 crc kubenswrapper[4774]: E0127 00:10:09.319056 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-97k6k" podUID="386f196b-c4bc-4fea-924b-c0487a352310" Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.358288 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6djzf"] Jan 27 00:10:09 crc kubenswrapper[4774]: W0127 00:10:09.468250 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode639e1da_0d65_4d42_b1fc_23d5db91e9e6.slice/crio-2dc19f95b24a889fa65ef843dbc6201539cabe87f4a95933fe0c63dbf910b530 WatchSource:0}: Error finding container 2dc19f95b24a889fa65ef843dbc6201539cabe87f4a95933fe0c63dbf910b530: Status 404 returned error can't find the container with id 2dc19f95b24a889fa65ef843dbc6201539cabe87f4a95933fe0c63dbf910b530 Jan 27 00:10:10 crc kubenswrapper[4774]: I0127 00:10:10.326618 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds9fx" event={"ID":"687ff3b2-f773-482b-9bf7-1b6135b4d6ac","Type":"ContainerStarted","Data":"d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f"} Jan 27 00:10:10 crc kubenswrapper[4774]: I0127 00:10:10.329030 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6djzf" event={"ID":"e639e1da-0d65-4d42-b1fc-23d5db91e9e6","Type":"ContainerStarted","Data":"847ee6772016091d4a0142e3d9b733102020545c4889f9bf358fc28dfe1d709a"} Jan 27 00:10:10 crc kubenswrapper[4774]: I0127 00:10:10.329068 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6djzf" event={"ID":"e639e1da-0d65-4d42-b1fc-23d5db91e9e6","Type":"ContainerStarted","Data":"27e7305723b8b46d31d00cba48862e9225f2fbda4ec93f9fff46d97c268a5ffb"} Jan 27 00:10:10 crc kubenswrapper[4774]: I0127 00:10:10.329111 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6djzf" event={"ID":"e639e1da-0d65-4d42-b1fc-23d5db91e9e6","Type":"ContainerStarted","Data":"2dc19f95b24a889fa65ef843dbc6201539cabe87f4a95933fe0c63dbf910b530"} Jan 27 00:10:10 crc kubenswrapper[4774]: I0127 00:10:10.332052 4774 generic.go:334] "Generic (PLEG): container finished" podID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerID="20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f" exitCode=0 Jan 27 00:10:10 
crc kubenswrapper[4774]: I0127 00:10:10.332129 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnlbr" event={"ID":"6cad24dd-acc4-40e5-8380-e0d74be79921","Type":"ContainerDied","Data":"20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f"} Jan 27 00:10:10 crc kubenswrapper[4774]: I0127 00:10:10.338454 4774 generic.go:334] "Generic (PLEG): container finished" podID="af002227-deb6-4a24-8ce5-051f93bc178b" containerID="b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0" exitCode=0 Jan 27 00:10:10 crc kubenswrapper[4774]: I0127 00:10:10.338532 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pj24q" event={"ID":"af002227-deb6-4a24-8ce5-051f93bc178b","Type":"ContainerDied","Data":"b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0"} Jan 27 00:10:10 crc kubenswrapper[4774]: I0127 00:10:10.348909 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ds9fx" podStartSLOduration=2.336960469 podStartE2EDuration="32.348876189s" podCreationTimestamp="2026-01-27 00:09:38 +0000 UTC" firstStartedPulling="2026-01-27 00:09:40.023408462 +0000 UTC m=+158.329185336" lastFinishedPulling="2026-01-27 00:10:10.035324172 +0000 UTC m=+188.341101056" observedRunningTime="2026-01-27 00:10:10.346630689 +0000 UTC m=+188.652407583" watchObservedRunningTime="2026-01-27 00:10:10.348876189 +0000 UTC m=+188.654653083" Jan 27 00:10:10 crc kubenswrapper[4774]: I0127 00:10:10.405938 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6djzf" podStartSLOduration=168.405912977 podStartE2EDuration="2m48.405912977s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:10:10.404229642 +0000 UTC m=+188.710006526" watchObservedRunningTime="2026-01-27 00:10:10.405912977 +0000 UTC m=+188.711689861" Jan 27 00:10:10 crc kubenswrapper[4774]: I0127 00:10:10.693184 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:10:11 crc kubenswrapper[4774]: I0127 00:10:11.346251 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnlbr" event={"ID":"6cad24dd-acc4-40e5-8380-e0d74be79921","Type":"ContainerStarted","Data":"5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e"} Jan 27 00:10:11 crc kubenswrapper[4774]: I0127 00:10:11.349979 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pj24q" event={"ID":"af002227-deb6-4a24-8ce5-051f93bc178b","Type":"ContainerStarted","Data":"4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4"} Jan 27 00:10:11 crc kubenswrapper[4774]: I0127 00:10:11.379135 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lnlbr" podStartSLOduration=2.473852607 podStartE2EDuration="35.379107725s" podCreationTimestamp="2026-01-27 00:09:36 +0000 UTC" firstStartedPulling="2026-01-27 00:09:37.978904049 +0000 UTC m=+156.284680933" lastFinishedPulling="2026-01-27 00:10:10.884159167 +0000 UTC m=+189.189936051" observedRunningTime="2026-01-27 00:10:11.374972054 +0000 UTC m=+189.680748948" watchObservedRunningTime="2026-01-27 00:10:11.379107725 +0000 UTC 
m=+189.684884609" Jan 27 00:10:11 crc kubenswrapper[4774]: I0127 00:10:11.434891 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pj24q" podStartSLOduration=2.610337687 podStartE2EDuration="35.434853259s" podCreationTimestamp="2026-01-27 00:09:36 +0000 UTC" firstStartedPulling="2026-01-27 00:09:37.983526191 +0000 UTC m=+156.289303075" lastFinishedPulling="2026-01-27 00:10:10.808041753 +0000 UTC m=+189.113818647" observedRunningTime="2026-01-27 00:10:11.430848241 +0000 UTC m=+189.736625125" watchObservedRunningTime="2026-01-27 00:10:11.434853259 +0000 UTC m=+189.740630143" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.718969 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 27 00:10:14 crc kubenswrapper[4774]: E0127 00:10:14.721574 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d69e5e66-2cd3-4102-9126-0577e4442e7b" containerName="pruner" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.721752 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d69e5e66-2cd3-4102-9126-0577e4442e7b" containerName="pruner" Jan 27 00:10:14 crc kubenswrapper[4774]: E0127 00:10:14.721901 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd05a6f8-6bd6-40f4-8a17-79fccebb7103" containerName="pruner" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.721984 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd05a6f8-6bd6-40f4-8a17-79fccebb7103" containerName="pruner" Jan 27 00:10:14 crc kubenswrapper[4774]: E0127 00:10:14.722111 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="518a161e-aeab-4ad6-a2c0-dee7ec963958" containerName="image-pruner" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.722190 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="518a161e-aeab-4ad6-a2c0-dee7ec963958" containerName="image-pruner" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.722400 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="518a161e-aeab-4ad6-a2c0-dee7ec963958" containerName="image-pruner" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.722597 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd05a6f8-6bd6-40f4-8a17-79fccebb7103" containerName="pruner" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.722721 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="d69e5e66-2cd3-4102-9126-0577e4442e7b" containerName="pruner" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.724047 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.727488 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.728124 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.728505 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.824942 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15eaf928-73fe-4143-8de9-77c76dc4a990-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15eaf928-73fe-4143-8de9-77c76dc4a990\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.825075 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15eaf928-73fe-4143-8de9-77c76dc4a990-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15eaf928-73fe-4143-8de9-77c76dc4a990\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.926166 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15eaf928-73fe-4143-8de9-77c76dc4a990-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15eaf928-73fe-4143-8de9-77c76dc4a990\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.926268 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15eaf928-73fe-4143-8de9-77c76dc4a990-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15eaf928-73fe-4143-8de9-77c76dc4a990\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.926673 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15eaf928-73fe-4143-8de9-77c76dc4a990-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15eaf928-73fe-4143-8de9-77c76dc4a990\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.955439 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15eaf928-73fe-4143-8de9-77c76dc4a990-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15eaf928-73fe-4143-8de9-77c76dc4a990\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 00:10:15 crc kubenswrapper[4774]: I0127 00:10:15.061207 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 00:10:15 crc kubenswrapper[4774]: I0127 00:10:15.492222 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 27 00:10:15 crc kubenswrapper[4774]: W0127 00:10:15.501735 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod15eaf928_73fe_4143_8de9_77c76dc4a990.slice/crio-e8fcdc2d03704b966cd0dbe8bdf5640b69380bf6aa553dc2077fdc28c7bf4d1a WatchSource:0}: Error finding container e8fcdc2d03704b966cd0dbe8bdf5640b69380bf6aa553dc2077fdc28c7bf4d1a: Status 404 returned error can't find the container with id e8fcdc2d03704b966cd0dbe8bdf5640b69380bf6aa553dc2077fdc28c7bf4d1a Jan 27 00:10:16 crc kubenswrapper[4774]: I0127 00:10:16.393634 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"15eaf928-73fe-4143-8de9-77c76dc4a990","Type":"ContainerStarted","Data":"f0aa1f85fd14124d38f2ae424a2c19c56e444540fb77f1c91838b8daf417fe57"} Jan 27 00:10:16 crc kubenswrapper[4774]: I0127 00:10:16.394283 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"15eaf928-73fe-4143-8de9-77c76dc4a990","Type":"ContainerStarted","Data":"e8fcdc2d03704b966cd0dbe8bdf5640b69380bf6aa553dc2077fdc28c7bf4d1a"} Jan 27 00:10:16 crc kubenswrapper[4774]: I0127 00:10:16.414339 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.414318082 podStartE2EDuration="2.414318082s" podCreationTimestamp="2026-01-27 00:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:10:16.41015206 +0000 UTC m=+194.715928944" watchObservedRunningTime="2026-01-27 00:10:16.414318082 +0000 UTC m=+194.720094966" Jan 27 00:10:16 crc kubenswrapper[4774]: I0127 00:10:16.523724 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-brc4j"] Jan 27 00:10:16 crc kubenswrapper[4774]: I0127 00:10:16.549153 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:10:16 crc kubenswrapper[4774]: I0127 00:10:16.549229 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:10:16 crc kubenswrapper[4774]: I0127 00:10:16.644699 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:10:16 crc kubenswrapper[4774]: I0127 00:10:16.647413 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:10:16 crc kubenswrapper[4774]: I0127 00:10:16.882911 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:10:16 crc kubenswrapper[4774]: I0127 00:10:16.883563 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:10:17 crc kubenswrapper[4774]: I0127 00:10:17.400154 4774 generic.go:334] "Generic (PLEG): container finished" podID="15eaf928-73fe-4143-8de9-77c76dc4a990" containerID="f0aa1f85fd14124d38f2ae424a2c19c56e444540fb77f1c91838b8daf417fe57" exitCode=0 
Jan 27 00:10:17 crc kubenswrapper[4774]: I0127 00:10:17.400208 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"15eaf928-73fe-4143-8de9-77c76dc4a990","Type":"ContainerDied","Data":"f0aa1f85fd14124d38f2ae424a2c19c56e444540fb77f1c91838b8daf417fe57"} Jan 27 00:10:17 crc kubenswrapper[4774]: I0127 00:10:17.457360 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:10:17 crc kubenswrapper[4774]: I0127 00:10:17.458200 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:10:18 crc kubenswrapper[4774]: I0127 00:10:18.614422 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:10:18 crc kubenswrapper[4774]: I0127 00:10:18.615302 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:10:18 crc kubenswrapper[4774]: I0127 00:10:18.680928 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:10:18 crc kubenswrapper[4774]: I0127 00:10:18.760129 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 00:10:18 crc kubenswrapper[4774]: I0127 00:10:18.879563 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15eaf928-73fe-4143-8de9-77c76dc4a990-kube-api-access\") pod \"15eaf928-73fe-4143-8de9-77c76dc4a990\" (UID: \"15eaf928-73fe-4143-8de9-77c76dc4a990\") " Jan 27 00:10:18 crc kubenswrapper[4774]: I0127 00:10:18.879768 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15eaf928-73fe-4143-8de9-77c76dc4a990-kubelet-dir\") pod \"15eaf928-73fe-4143-8de9-77c76dc4a990\" (UID: \"15eaf928-73fe-4143-8de9-77c76dc4a990\") " Jan 27 00:10:18 crc kubenswrapper[4774]: I0127 00:10:18.879848 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15eaf928-73fe-4143-8de9-77c76dc4a990-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "15eaf928-73fe-4143-8de9-77c76dc4a990" (UID: "15eaf928-73fe-4143-8de9-77c76dc4a990"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:10:18 crc kubenswrapper[4774]: I0127 00:10:18.880081 4774 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15eaf928-73fe-4143-8de9-77c76dc4a990-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:18 crc kubenswrapper[4774]: I0127 00:10:18.888497 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15eaf928-73fe-4143-8de9-77c76dc4a990-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "15eaf928-73fe-4143-8de9-77c76dc4a990" (UID: "15eaf928-73fe-4143-8de9-77c76dc4a990"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:10:18 crc kubenswrapper[4774]: I0127 00:10:18.982189 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15eaf928-73fe-4143-8de9-77c76dc4a990-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.412592 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"15eaf928-73fe-4143-8de9-77c76dc4a990","Type":"ContainerDied","Data":"e8fcdc2d03704b966cd0dbe8bdf5640b69380bf6aa553dc2077fdc28c7bf4d1a"} Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.413110 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8fcdc2d03704b966cd0dbe8bdf5640b69380bf6aa553dc2077fdc28c7bf4d1a" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.412878 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.476882 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.525781 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 27 00:10:19 crc kubenswrapper[4774]: E0127 00:10:19.526133 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15eaf928-73fe-4143-8de9-77c76dc4a990" containerName="pruner" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.526166 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="15eaf928-73fe-4143-8de9-77c76dc4a990" containerName="pruner" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.526280 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="15eaf928-73fe-4143-8de9-77c76dc4a990" containerName="pruner" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.526777 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.529467 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.529722 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.541977 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.592808 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.593296 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-var-lock\") pod \"installer-9-crc\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.593392 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kube-api-access\") pod \"installer-9-crc\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.694764 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.695718 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-var-lock\") pod \"installer-9-crc\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.695844 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kube-api-access\") pod \"installer-9-crc\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.695810 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-var-lock\") pod \"installer-9-crc\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.695645 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kubelet-dir\") pod \"installer-9-crc\" (UID: 
\"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.715672 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kube-api-access\") pod \"installer-9-crc\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.848967 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:20 crc kubenswrapper[4774]: I0127 00:10:20.274566 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 27 00:10:20 crc kubenswrapper[4774]: W0127 00:10:20.282559 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd05ddc41_daa1_4deb_a5bf_d5b5f9181a1c.slice/crio-ed588f4d65707d0fe13e5dfdb082cdfddb42e10f2ae3fc8508a8ae1455271eaa WatchSource:0}: Error finding container ed588f4d65707d0fe13e5dfdb082cdfddb42e10f2ae3fc8508a8ae1455271eaa: Status 404 returned error can't find the container with id ed588f4d65707d0fe13e5dfdb082cdfddb42e10f2ae3fc8508a8ae1455271eaa Jan 27 00:10:20 crc kubenswrapper[4774]: I0127 00:10:20.418058 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c","Type":"ContainerStarted","Data":"ed588f4d65707d0fe13e5dfdb082cdfddb42e10f2ae3fc8508a8ae1455271eaa"} Jan 27 00:10:21 crc kubenswrapper[4774]: I0127 00:10:21.433015 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c","Type":"ContainerStarted","Data":"05e7a9c90f1cfc8896ce47e0dacfdae023353b22d601861957e2c1280d6e97a3"} Jan 27 00:10:21 crc kubenswrapper[4774]: I0127 00:10:21.435084 4774 generic.go:334] "Generic (PLEG): container finished" podID="5e621715-39b3-4094-b589-ede8b74b0e8f" containerID="99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917" exitCode=0 Jan 27 00:10:21 crc kubenswrapper[4774]: I0127 00:10:21.435658 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngr5v" event={"ID":"5e621715-39b3-4094-b589-ede8b74b0e8f","Type":"ContainerDied","Data":"99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917"} Jan 27 00:10:21 crc kubenswrapper[4774]: I0127 00:10:21.455731 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.455709481 podStartE2EDuration="2.455709481s" podCreationTimestamp="2026-01-27 00:10:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:10:21.45255094 +0000 UTC m=+199.758327824" watchObservedRunningTime="2026-01-27 00:10:21.455709481 +0000 UTC m=+199.761486365" Jan 27 00:10:23 crc kubenswrapper[4774]: I0127 00:10:23.447194 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb8hq" event={"ID":"f4f2c4e4-f111-453e-b953-cdf2528f9a3e","Type":"ContainerStarted","Data":"72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29"} Jan 27 00:10:23 crc kubenswrapper[4774]: I0127 00:10:23.449676 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-ngr5v" event={"ID":"5e621715-39b3-4094-b589-ede8b74b0e8f","Type":"ContainerStarted","Data":"7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92"} Jan 27 00:10:23 crc kubenswrapper[4774]: I0127 00:10:23.451457 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97k6k" event={"ID":"386f196b-c4bc-4fea-924b-c0487a352310","Type":"ContainerStarted","Data":"d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c"} Jan 27 00:10:23 crc kubenswrapper[4774]: I0127 00:10:23.453761 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mch4n" event={"ID":"6e47ec75-abd8-41d4-a3c8-e722d21a305f","Type":"ContainerStarted","Data":"5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4"} Jan 27 00:10:24 crc kubenswrapper[4774]: I0127 00:10:24.461667 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlk6f" event={"ID":"b5f7850c-533c-48e5-bead-ddc0e7ba8d83","Type":"ContainerStarted","Data":"29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb"} Jan 27 00:10:24 crc kubenswrapper[4774]: I0127 00:10:24.464342 4774 generic.go:334] "Generic (PLEG): container finished" podID="386f196b-c4bc-4fea-924b-c0487a352310" containerID="d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c" exitCode=0 Jan 27 00:10:24 crc kubenswrapper[4774]: I0127 00:10:24.464417 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97k6k" event={"ID":"386f196b-c4bc-4fea-924b-c0487a352310","Type":"ContainerDied","Data":"d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c"} Jan 27 00:10:24 crc kubenswrapper[4774]: I0127 00:10:24.467002 4774 generic.go:334] "Generic (PLEG): container finished" podID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerID="5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4" exitCode=0 Jan 27 00:10:24 crc kubenswrapper[4774]: I0127 00:10:24.467082 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mch4n" event={"ID":"6e47ec75-abd8-41d4-a3c8-e722d21a305f","Type":"ContainerDied","Data":"5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4"} Jan 27 00:10:24 crc kubenswrapper[4774]: I0127 00:10:24.474310 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerID="72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29" exitCode=0 Jan 27 00:10:24 crc kubenswrapper[4774]: I0127 00:10:24.474364 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb8hq" event={"ID":"f4f2c4e4-f111-453e-b953-cdf2528f9a3e","Type":"ContainerDied","Data":"72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29"} Jan 27 00:10:24 crc kubenswrapper[4774]: I0127 00:10:24.484476 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ngr5v" podStartSLOduration=3.5101536380000002 podStartE2EDuration="48.484454948s" podCreationTimestamp="2026-01-27 00:09:36 +0000 UTC" firstStartedPulling="2026-01-27 00:09:37.98190041 +0000 UTC m=+156.287677294" lastFinishedPulling="2026-01-27 00:10:22.95620172 +0000 UTC m=+201.261978604" observedRunningTime="2026-01-27 00:10:23.544467321 +0000 UTC m=+201.850244205" watchObservedRunningTime="2026-01-27 00:10:24.484454948 +0000 UTC m=+202.790231832" Jan 27 00:10:25 crc 
kubenswrapper[4774]: I0127 00:10:25.486286 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97k6k" event={"ID":"386f196b-c4bc-4fea-924b-c0487a352310","Type":"ContainerStarted","Data":"e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b"} Jan 27 00:10:25 crc kubenswrapper[4774]: I0127 00:10:25.490022 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mch4n" event={"ID":"6e47ec75-abd8-41d4-a3c8-e722d21a305f","Type":"ContainerStarted","Data":"c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49"} Jan 27 00:10:25 crc kubenswrapper[4774]: I0127 00:10:25.492476 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb8hq" event={"ID":"f4f2c4e4-f111-453e-b953-cdf2528f9a3e","Type":"ContainerStarted","Data":"778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5"} Jan 27 00:10:25 crc kubenswrapper[4774]: I0127 00:10:25.494292 4774 generic.go:334] "Generic (PLEG): container finished" podID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerID="29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb" exitCode=0 Jan 27 00:10:25 crc kubenswrapper[4774]: I0127 00:10:25.494346 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlk6f" event={"ID":"b5f7850c-533c-48e5-bead-ddc0e7ba8d83","Type":"ContainerDied","Data":"29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb"} Jan 27 00:10:25 crc kubenswrapper[4774]: I0127 00:10:25.509122 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-97k6k" podStartSLOduration=2.720155274 podStartE2EDuration="46.509109024s" podCreationTimestamp="2026-01-27 00:09:39 +0000 UTC" firstStartedPulling="2026-01-27 00:09:41.072964503 +0000 UTC m=+159.378741387" lastFinishedPulling="2026-01-27 00:10:24.861918253 +0000 UTC m=+203.167695137" observedRunningTime="2026-01-27 00:10:25.507945984 +0000 UTC m=+203.813722868" watchObservedRunningTime="2026-01-27 00:10:25.509109024 +0000 UTC m=+203.814885908" Jan 27 00:10:25 crc kubenswrapper[4774]: I0127 00:10:25.527399 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mch4n" podStartSLOduration=2.705947161 podStartE2EDuration="46.527374661s" podCreationTimestamp="2026-01-27 00:09:39 +0000 UTC" firstStartedPulling="2026-01-27 00:09:41.067042783 +0000 UTC m=+159.372819667" lastFinishedPulling="2026-01-27 00:10:24.888470283 +0000 UTC m=+203.194247167" observedRunningTime="2026-01-27 00:10:25.525925194 +0000 UTC m=+203.831702078" watchObservedRunningTime="2026-01-27 00:10:25.527374661 +0000 UTC m=+203.833151545" Jan 27 00:10:26 crc kubenswrapper[4774]: I0127 00:10:26.504559 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlk6f" event={"ID":"b5f7850c-533c-48e5-bead-ddc0e7ba8d83","Type":"ContainerStarted","Data":"0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386"} Jan 27 00:10:26 crc kubenswrapper[4774]: I0127 00:10:26.523750 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tlk6f" podStartSLOduration=2.341231838 podStartE2EDuration="48.523714842s" podCreationTimestamp="2026-01-27 00:09:38 +0000 UTC" firstStartedPulling="2026-01-27 00:09:40.027033463 +0000 UTC m=+158.332810347" lastFinishedPulling="2026-01-27 00:10:26.209516467 +0000 UTC 
m=+204.515293351" observedRunningTime="2026-01-27 00:10:26.523093085 +0000 UTC m=+204.828869969" watchObservedRunningTime="2026-01-27 00:10:26.523714842 +0000 UTC m=+204.829491726" Jan 27 00:10:26 crc kubenswrapper[4774]: I0127 00:10:26.529156 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kb8hq" podStartSLOduration=3.553429184 podStartE2EDuration="50.52914077s" podCreationTimestamp="2026-01-27 00:09:36 +0000 UTC" firstStartedPulling="2026-01-27 00:09:37.987272395 +0000 UTC m=+156.293049279" lastFinishedPulling="2026-01-27 00:10:24.962983981 +0000 UTC m=+203.268760865" observedRunningTime="2026-01-27 00:10:25.571151642 +0000 UTC m=+203.876928526" watchObservedRunningTime="2026-01-27 00:10:26.52914077 +0000 UTC m=+204.834917654" Jan 27 00:10:26 crc kubenswrapper[4774]: I0127 00:10:26.829343 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:10:26 crc kubenswrapper[4774]: I0127 00:10:26.829781 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:10:26 crc kubenswrapper[4774]: I0127 00:10:26.878489 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:10:27 crc kubenswrapper[4774]: I0127 00:10:27.060389 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:10:27 crc kubenswrapper[4774]: I0127 00:10:27.060462 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:10:27 crc kubenswrapper[4774]: I0127 00:10:27.122402 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:10:28 crc kubenswrapper[4774]: I0127 00:10:28.582946 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:10:29 crc kubenswrapper[4774]: I0127 00:10:29.038978 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:10:29 crc kubenswrapper[4774]: I0127 00:10:29.039030 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:10:29 crc kubenswrapper[4774]: I0127 00:10:29.076284 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:10:29 crc kubenswrapper[4774]: I0127 00:10:29.310661 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ngr5v"] Jan 27 00:10:29 crc kubenswrapper[4774]: I0127 00:10:29.652659 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:10:29 crc kubenswrapper[4774]: I0127 00:10:29.652717 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.025851 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.025933 4774 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.525298 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ngr5v" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" containerName="registry-server" containerID="cri-o://7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92" gracePeriod=2 Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.696026 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-97k6k" podUID="386f196b-c4bc-4fea-924b-c0487a352310" containerName="registry-server" probeResult="failure" output=< Jan 27 00:10:30 crc kubenswrapper[4774]: timeout: failed to connect service ":50051" within 1s Jan 27 00:10:30 crc kubenswrapper[4774]: > Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.864619 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.956140 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-utilities\") pod \"5e621715-39b3-4094-b589-ede8b74b0e8f\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.956307 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-catalog-content\") pod \"5e621715-39b3-4094-b589-ede8b74b0e8f\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.956403 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvt8p\" (UniqueName: \"kubernetes.io/projected/5e621715-39b3-4094-b589-ede8b74b0e8f-kube-api-access-nvt8p\") pod \"5e621715-39b3-4094-b589-ede8b74b0e8f\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.957090 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-utilities" (OuterVolumeSpecName: "utilities") pod "5e621715-39b3-4094-b589-ede8b74b0e8f" (UID: "5e621715-39b3-4094-b589-ede8b74b0e8f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.959960 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.977092 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e621715-39b3-4094-b589-ede8b74b0e8f-kube-api-access-nvt8p" (OuterVolumeSpecName: "kube-api-access-nvt8p") pod "5e621715-39b3-4094-b589-ede8b74b0e8f" (UID: "5e621715-39b3-4094-b589-ede8b74b0e8f"). InnerVolumeSpecName "kube-api-access-nvt8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.011253 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e621715-39b3-4094-b589-ede8b74b0e8f" (UID: "5e621715-39b3-4094-b589-ede8b74b0e8f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.061890 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvt8p\" (UniqueName: \"kubernetes.io/projected/5e621715-39b3-4094-b589-ede8b74b0e8f-kube-api-access-nvt8p\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.061955 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.081550 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mch4n" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerName="registry-server" probeResult="failure" output=< Jan 27 00:10:31 crc kubenswrapper[4774]: timeout: failed to connect service ":50051" within 1s Jan 27 00:10:31 crc kubenswrapper[4774]: > Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.532002 4774 generic.go:334] "Generic (PLEG): container finished" podID="5e621715-39b3-4094-b589-ede8b74b0e8f" containerID="7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92" exitCode=0 Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.532059 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngr5v" event={"ID":"5e621715-39b3-4094-b589-ede8b74b0e8f","Type":"ContainerDied","Data":"7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92"} Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.532073 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.532098 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngr5v" event={"ID":"5e621715-39b3-4094-b589-ede8b74b0e8f","Type":"ContainerDied","Data":"fafa96ca772c719431784a78d0a35959db893c145e408b85c8cbdfa6a0078917"} Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.532118 4774 scope.go:117] "RemoveContainer" containerID="7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.550018 4774 scope.go:117] "RemoveContainer" containerID="99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.556727 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ngr5v"] Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.559525 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ngr5v"] Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.580810 4774 scope.go:117] "RemoveContainer" containerID="5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.594048 4774 scope.go:117] "RemoveContainer" containerID="7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92" Jan 27 00:10:31 crc kubenswrapper[4774]: E0127 00:10:31.595438 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92\": container with ID starting with 7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92 not found: ID does not exist" containerID="7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.595487 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92"} err="failed to get container status \"7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92\": rpc error: code = NotFound desc = could not find container \"7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92\": container with ID starting with 7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92 not found: ID does not exist" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.595540 4774 scope.go:117] "RemoveContainer" containerID="99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917" Jan 27 00:10:31 crc kubenswrapper[4774]: E0127 00:10:31.596150 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917\": container with ID starting with 99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917 not found: ID does not exist" containerID="99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.596193 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917"} err="failed to get container status \"99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917\": rpc error: code = NotFound desc = could not find 
container \"99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917\": container with ID starting with 99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917 not found: ID does not exist" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.596225 4774 scope.go:117] "RemoveContainer" containerID="5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202" Jan 27 00:10:31 crc kubenswrapper[4774]: E0127 00:10:31.596459 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202\": container with ID starting with 5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202 not found: ID does not exist" containerID="5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.596481 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202"} err="failed to get container status \"5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202\": rpc error: code = NotFound desc = could not find container \"5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202\": container with ID starting with 5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202 not found: ID does not exist" Jan 27 00:10:32 crc kubenswrapper[4774]: I0127 00:10:32.362402 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" path="/var/lib/kubelet/pods/5e621715-39b3-4094-b589-ede8b74b0e8f/volumes" Jan 27 00:10:36 crc kubenswrapper[4774]: I0127 00:10:36.675723 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:10:36 crc kubenswrapper[4774]: I0127 00:10:36.676091 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:10:36 crc kubenswrapper[4774]: I0127 00:10:36.676148 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:10:36 crc kubenswrapper[4774]: I0127 00:10:36.677200 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85"} pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 00:10:36 crc kubenswrapper[4774]: I0127 00:10:36.677347 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" containerID="cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85" gracePeriod=600 Jan 27 00:10:37 crc kubenswrapper[4774]: I0127 00:10:37.116390 4774 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:10:37 crc kubenswrapper[4774]: I0127 00:10:37.586232 4774 generic.go:334] "Generic (PLEG): container finished" podID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerID="18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85" exitCode=0 Jan 27 00:10:37 crc kubenswrapper[4774]: I0127 00:10:37.586593 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerDied","Data":"18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85"} Jan 27 00:10:37 crc kubenswrapper[4774]: I0127 00:10:37.586631 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"447c321d916e303176c9279a924cd06866f1f990692f85480dc8efad70b988f5"} Jan 27 00:10:39 crc kubenswrapper[4774]: I0127 00:10:39.087820 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:10:39 crc kubenswrapper[4774]: I0127 00:10:39.710072 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kb8hq"] Jan 27 00:10:39 crc kubenswrapper[4774]: I0127 00:10:39.710587 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kb8hq" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerName="registry-server" containerID="cri-o://778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5" gracePeriod=2 Jan 27 00:10:39 crc kubenswrapper[4774]: I0127 00:10:39.721530 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:10:39 crc kubenswrapper[4774]: I0127 00:10:39.769676 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.059666 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.067458 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.085799 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-catalog-content\") pod \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.085869 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57w5g\" (UniqueName: \"kubernetes.io/projected/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-kube-api-access-57w5g\") pod \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.085952 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-utilities\") pod \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.089456 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-utilities" (OuterVolumeSpecName: "utilities") pod "f4f2c4e4-f111-453e-b953-cdf2528f9a3e" (UID: "f4f2c4e4-f111-453e-b953-cdf2528f9a3e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.096376 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-kube-api-access-57w5g" (OuterVolumeSpecName: "kube-api-access-57w5g") pod "f4f2c4e4-f111-453e-b953-cdf2528f9a3e" (UID: "f4f2c4e4-f111-453e-b953-cdf2528f9a3e"). InnerVolumeSpecName "kube-api-access-57w5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.110225 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.138785 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4f2c4e4-f111-453e-b953-cdf2528f9a3e" (UID: "f4f2c4e4-f111-453e-b953-cdf2528f9a3e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.187372 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.187402 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57w5g\" (UniqueName: \"kubernetes.io/projected/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-kube-api-access-57w5g\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.187413 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.601303 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerID="778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5" exitCode=0 Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.601368 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.601381 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb8hq" event={"ID":"f4f2c4e4-f111-453e-b953-cdf2528f9a3e","Type":"ContainerDied","Data":"778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5"} Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.601430 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb8hq" event={"ID":"f4f2c4e4-f111-453e-b953-cdf2528f9a3e","Type":"ContainerDied","Data":"fee2b67cf24ce6f088a6fa76fdec56fa754242f0563e3da4ab6a39b52a665f76"} Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.601461 4774 scope.go:117] "RemoveContainer" containerID="778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.626123 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kb8hq"] Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.633564 4774 scope.go:117] "RemoveContainer" containerID="72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.636389 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kb8hq"] Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.650919 4774 scope.go:117] "RemoveContainer" containerID="32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.667496 4774 scope.go:117] "RemoveContainer" containerID="778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5" Jan 27 00:10:40 crc kubenswrapper[4774]: E0127 00:10:40.668060 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5\": container with ID starting with 778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5 not found: ID does not exist" containerID="778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.668134 
4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5"} err="failed to get container status \"778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5\": rpc error: code = NotFound desc = could not find container \"778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5\": container with ID starting with 778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5 not found: ID does not exist" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.668182 4774 scope.go:117] "RemoveContainer" containerID="72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29" Jan 27 00:10:40 crc kubenswrapper[4774]: E0127 00:10:40.668777 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29\": container with ID starting with 72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29 not found: ID does not exist" containerID="72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.668806 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29"} err="failed to get container status \"72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29\": rpc error: code = NotFound desc = could not find container \"72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29\": container with ID starting with 72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29 not found: ID does not exist" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.668822 4774 scope.go:117] "RemoveContainer" containerID="32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf" Jan 27 00:10:40 crc kubenswrapper[4774]: E0127 00:10:40.669359 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf\": container with ID starting with 32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf not found: ID does not exist" containerID="32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.669398 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf"} err="failed to get container status \"32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf\": rpc error: code = NotFound desc = could not find container \"32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf\": container with ID starting with 32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf not found: ID does not exist" Jan 27 00:10:41 crc kubenswrapper[4774]: I0127 00:10:41.559776 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" podUID="f8bb8619-eac4-481e-bdb8-5fb5985c1844" containerName="oauth-openshift" containerID="cri-o://d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973" gracePeriod=15 Jan 27 00:10:41 crc kubenswrapper[4774]: I0127 00:10:41.914038 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-tlk6f"] Jan 27 00:10:41 crc kubenswrapper[4774]: I0127 00:10:41.914789 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tlk6f" podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerName="registry-server" containerID="cri-o://0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386" gracePeriod=2 Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.097235 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115260 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh9kp\" (UniqueName: \"kubernetes.io/projected/f8bb8619-eac4-481e-bdb8-5fb5985c1844-kube-api-access-mh9kp\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115371 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-serving-cert\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115409 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-router-certs\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115452 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-trusted-ca-bundle\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115489 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-service-ca\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115535 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-ocp-branding-template\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115608 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-dir\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115651 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-error\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115714 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-session\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115760 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-cliconfig\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115799 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-policies\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115847 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-idp-0-file-data\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115917 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-login\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.116029 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-provider-selection\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.116568 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.116991 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.117121 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.117570 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.117752 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.129445 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8bb8619-eac4-481e-bdb8-5fb5985c1844-kube-api-access-mh9kp" (OuterVolumeSpecName: "kube-api-access-mh9kp") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "kube-api-access-mh9kp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.130259 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.131525 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.132587 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.149139 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.151023 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.153058 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.154749 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.160200 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218386 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218441 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh9kp\" (UniqueName: \"kubernetes.io/projected/f8bb8619-eac4-481e-bdb8-5fb5985c1844-kube-api-access-mh9kp\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218462 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218481 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218494 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218508 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218522 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218538 4774 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218553 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218566 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218581 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218593 4774 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218606 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218622 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.315384 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.368829 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" path="/var/lib/kubelet/pods/f4f2c4e4-f111-453e-b953-cdf2528f9a3e/volumes" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.419926 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g24zl\" (UniqueName: \"kubernetes.io/projected/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-kube-api-access-g24zl\") pod \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.419998 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-utilities\") pod \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.420053 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-catalog-content\") pod \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.421324 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-utilities" (OuterVolumeSpecName: "utilities") pod "b5f7850c-533c-48e5-bead-ddc0e7ba8d83" (UID: "b5f7850c-533c-48e5-bead-ddc0e7ba8d83"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.424249 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-kube-api-access-g24zl" (OuterVolumeSpecName: "kube-api-access-g24zl") pod "b5f7850c-533c-48e5-bead-ddc0e7ba8d83" (UID: "b5f7850c-533c-48e5-bead-ddc0e7ba8d83"). InnerVolumeSpecName "kube-api-access-g24zl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.448553 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5f7850c-533c-48e5-bead-ddc0e7ba8d83" (UID: "b5f7850c-533c-48e5-bead-ddc0e7ba8d83"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.521445 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.521492 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g24zl\" (UniqueName: \"kubernetes.io/projected/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-kube-api-access-g24zl\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.521507 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.615774 4774 generic.go:334] "Generic (PLEG): container finished" podID="f8bb8619-eac4-481e-bdb8-5fb5985c1844" containerID="d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973" exitCode=0 Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.615883 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" event={"ID":"f8bb8619-eac4-481e-bdb8-5fb5985c1844","Type":"ContainerDied","Data":"d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973"} Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.615931 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.615986 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" event={"ID":"f8bb8619-eac4-481e-bdb8-5fb5985c1844","Type":"ContainerDied","Data":"6c6a9cbe32487a19776f84a95c618edb7f8a9a884ffdc154828b802d8fe8a819"} Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.616025 4774 scope.go:117] "RemoveContainer" containerID="d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.625958 4774 generic.go:334] "Generic (PLEG): container finished" podID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerID="0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386" exitCode=0 Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.626032 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlk6f" event={"ID":"b5f7850c-533c-48e5-bead-ddc0e7ba8d83","Type":"ContainerDied","Data":"0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386"} Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.626073 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.626081 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlk6f" event={"ID":"b5f7850c-533c-48e5-bead-ddc0e7ba8d83","Type":"ContainerDied","Data":"ce3bedf60ecff127b9cf54aca7f49498f367bae4df2ca706885feb215a762b62"} Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.645547 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-brc4j"] Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.650752 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-brc4j"] Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.663880 4774 scope.go:117] "RemoveContainer" containerID="d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973" Jan 27 00:10:42 crc kubenswrapper[4774]: E0127 00:10:42.664682 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973\": container with ID starting with d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973 not found: ID does not exist" containerID="d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.664840 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973"} err="failed to get container status \"d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973\": rpc error: code = NotFound desc = could not find container \"d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973\": container with ID starting with d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973 not found: ID does not exist" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.665022 4774 scope.go:117] "RemoveContainer" containerID="0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.668193 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tlk6f"] Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.670437 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tlk6f"] Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.683602 4774 scope.go:117] "RemoveContainer" containerID="29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.698149 4774 scope.go:117] "RemoveContainer" containerID="7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.712590 4774 scope.go:117] "RemoveContainer" containerID="0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386" Jan 27 00:10:42 crc kubenswrapper[4774]: E0127 00:10:42.713028 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386\": container with ID starting with 0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386 not found: ID does not exist" containerID="0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386" Jan 27 00:10:42 crc 
kubenswrapper[4774]: I0127 00:10:42.713059 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386"} err="failed to get container status \"0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386\": rpc error: code = NotFound desc = could not find container \"0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386\": container with ID starting with 0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386 not found: ID does not exist" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.713086 4774 scope.go:117] "RemoveContainer" containerID="29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb" Jan 27 00:10:42 crc kubenswrapper[4774]: E0127 00:10:42.713387 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb\": container with ID starting with 29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb not found: ID does not exist" containerID="29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.713411 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb"} err="failed to get container status \"29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb\": rpc error: code = NotFound desc = could not find container \"29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb\": container with ID starting with 29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb not found: ID does not exist" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.713426 4774 scope.go:117] "RemoveContainer" containerID="7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731" Jan 27 00:10:42 crc kubenswrapper[4774]: E0127 00:10:42.713894 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731\": container with ID starting with 7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731 not found: ID does not exist" containerID="7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.713986 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731"} err="failed to get container status \"7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731\": rpc error: code = NotFound desc = could not find container \"7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731\": container with ID starting with 7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731 not found: ID does not exist" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.111066 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mch4n"] Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.111630 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mch4n" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerName="registry-server" 
containerID="cri-o://c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49" gracePeriod=2 Jan 27 00:10:44 crc kubenswrapper[4774]: E0127 00:10:44.195852 4774 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e47ec75_abd8_41d4_a3c8_e722d21a305f.slice/crio-c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49.scope\": RecentStats: unable to find data in memory cache]" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.366641 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" path="/var/lib/kubelet/pods/b5f7850c-533c-48e5-bead-ddc0e7ba8d83/volumes" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.368359 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8bb8619-eac4-481e-bdb8-5fb5985c1844" path="/var/lib/kubelet/pods/f8bb8619-eac4-481e-bdb8-5fb5985c1844/volumes" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.575488 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.644015 4774 generic.go:334] "Generic (PLEG): container finished" podID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerID="c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49" exitCode=0 Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.644064 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mch4n" event={"ID":"6e47ec75-abd8-41d4-a3c8-e722d21a305f","Type":"ContainerDied","Data":"c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49"} Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.644101 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mch4n" event={"ID":"6e47ec75-abd8-41d4-a3c8-e722d21a305f","Type":"ContainerDied","Data":"1bc82424f292aa8d315ed22486552e2ce20a0278e67aebe2d803ffbc4e72f07c"} Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.644126 4774 scope.go:117] "RemoveContainer" containerID="c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.644264 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.649749 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-catalog-content\") pod \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.650494 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8rvm\" (UniqueName: \"kubernetes.io/projected/6e47ec75-abd8-41d4-a3c8-e722d21a305f-kube-api-access-w8rvm\") pod \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.650660 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-utilities\") pod \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.653235 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-utilities" (OuterVolumeSpecName: "utilities") pod "6e47ec75-abd8-41d4-a3c8-e722d21a305f" (UID: "6e47ec75-abd8-41d4-a3c8-e722d21a305f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.660784 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e47ec75-abd8-41d4-a3c8-e722d21a305f-kube-api-access-w8rvm" (OuterVolumeSpecName: "kube-api-access-w8rvm") pod "6e47ec75-abd8-41d4-a3c8-e722d21a305f" (UID: "6e47ec75-abd8-41d4-a3c8-e722d21a305f"). InnerVolumeSpecName "kube-api-access-w8rvm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.667664 4774 scope.go:117] "RemoveContainer" containerID="5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.688688 4774 scope.go:117] "RemoveContainer" containerID="ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.704935 4774 scope.go:117] "RemoveContainer" containerID="c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49" Jan 27 00:10:44 crc kubenswrapper[4774]: E0127 00:10:44.705351 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49\": container with ID starting with c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49 not found: ID does not exist" containerID="c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.705396 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49"} err="failed to get container status \"c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49\": rpc error: code = NotFound desc = could not find container \"c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49\": container with ID starting with c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49 not found: ID does not exist" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.705430 4774 scope.go:117] "RemoveContainer" containerID="5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4" Jan 27 00:10:44 crc kubenswrapper[4774]: E0127 00:10:44.705850 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4\": container with ID starting with 5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4 not found: ID does not exist" containerID="5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.705915 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4"} err="failed to get container status \"5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4\": rpc error: code = NotFound desc = could not find container \"5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4\": container with ID starting with 5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4 not found: ID does not exist" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.705935 4774 scope.go:117] "RemoveContainer" containerID="ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d" Jan 27 00:10:44 crc kubenswrapper[4774]: E0127 00:10:44.706264 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d\": container with ID starting with ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d not found: ID does not exist" containerID="ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d" Jan 27 00:10:44 crc 
kubenswrapper[4774]: I0127 00:10:44.706296 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d"} err="failed to get container status \"ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d\": rpc error: code = NotFound desc = could not find container \"ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d\": container with ID starting with ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d not found: ID does not exist" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.753522 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.753575 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8rvm\" (UniqueName: \"kubernetes.io/projected/6e47ec75-abd8-41d4-a3c8-e722d21a305f-kube-api-access-w8rvm\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.796504 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e47ec75-abd8-41d4-a3c8-e722d21a305f" (UID: "6e47ec75-abd8-41d4-a3c8-e722d21a305f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.854571 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.991355 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mch4n"] Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.996638 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mch4n"] Jan 27 00:10:46 crc kubenswrapper[4774]: I0127 00:10:46.365278 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" path="/var/lib/kubelet/pods/6e47ec75-abd8-41d4-a3c8-e722d21a305f/volumes" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633433 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-8445cf6b-22hmn"] Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633671 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" containerName="extract-utilities" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633686 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" containerName="extract-utilities" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633700 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerName="extract-utilities" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633709 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerName="extract-utilities" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633722 4774 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633731 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633746 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerName="extract-utilities" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633754 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerName="extract-utilities" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633767 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" containerName="extract-content" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633775 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" containerName="extract-content" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633792 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerName="extract-content" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633802 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerName="extract-content" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633820 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerName="extract-content" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633831 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerName="extract-content" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633843 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerName="extract-utilities" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633882 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerName="extract-utilities" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633897 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633911 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633925 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633935 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633954 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633964 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633981 4774 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerName="extract-content" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633993 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerName="extract-content" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.634004 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8bb8619-eac4-481e-bdb8-5fb5985c1844" containerName="oauth-openshift" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.634014 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8bb8619-eac4-481e-bdb8-5fb5985c1844" containerName="oauth-openshift" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.634164 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.634187 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.634211 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8bb8619-eac4-481e-bdb8-5fb5985c1844" containerName="oauth-openshift" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.634224 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.634241 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.634754 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.639360 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.640290 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.640565 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.640853 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.641140 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.641495 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.642156 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.642286 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.642498 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.642767 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.644965 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.648233 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.656368 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8445cf6b-22hmn"] Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.657841 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.661036 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.662688 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712198 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-audit-policies\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 
00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712257 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5rmn\" (UniqueName: \"kubernetes.io/projected/ba9690af-8b7d-4404-93d7-87b2de9bd203-kube-api-access-b5rmn\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712306 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712332 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-template-login\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712364 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712435 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712480 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ba9690af-8b7d-4404-93d7-87b2de9bd203-audit-dir\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712524 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712547 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: 
\"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712571 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-router-certs\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712598 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712623 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-template-error\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712649 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-session\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712730 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-service-ca\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814476 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-audit-policies\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814533 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5rmn\" (UniqueName: \"kubernetes.io/projected/ba9690af-8b7d-4404-93d7-87b2de9bd203-kube-api-access-b5rmn\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814569 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-template-login\") pod 
\"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814586 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814609 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814635 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814666 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ba9690af-8b7d-4404-93d7-87b2de9bd203-audit-dir\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814694 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814712 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814732 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-router-certs\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814750 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814769 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-template-error\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814785 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-session\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814802 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-service-ca\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.815750 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-service-ca\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.816096 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-audit-policies\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.817150 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.817833 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ba9690af-8b7d-4404-93d7-87b2de9bd203-audit-dir\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.819147 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 
00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.822780 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-router-certs\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.822951 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-template-login\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.823074 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.824576 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.824618 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.824710 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-session\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.826437 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.827617 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-template-error\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.839117 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5rmn\" (UniqueName: \"kubernetes.io/projected/ba9690af-8b7d-4404-93d7-87b2de9bd203-kube-api-access-b5rmn\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.983919 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:49 crc kubenswrapper[4774]: I0127 00:10:49.265833 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8445cf6b-22hmn"] Jan 27 00:10:49 crc kubenswrapper[4774]: I0127 00:10:49.691294 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" event={"ID":"ba9690af-8b7d-4404-93d7-87b2de9bd203","Type":"ContainerStarted","Data":"d5c16dbbc94e1e3e428a80a4db9bfb7a6af9932fdaed30a170c2fa3ef5cb0ad2"} Jan 27 00:10:49 crc kubenswrapper[4774]: I0127 00:10:49.691362 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" event={"ID":"ba9690af-8b7d-4404-93d7-87b2de9bd203","Type":"ContainerStarted","Data":"1cf87edd0cc0b54623f996856a7b9303f51103b7b1bc70d8d71d7c00339b3fa0"} Jan 27 00:10:49 crc kubenswrapper[4774]: I0127 00:10:49.691530 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:49 crc kubenswrapper[4774]: I0127 00:10:49.693257 4774 patch_prober.go:28] interesting pod/oauth-openshift-8445cf6b-22hmn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.57:6443/healthz\": dial tcp 10.217.0.57:6443: connect: connection refused" start-of-body= Jan 27 00:10:49 crc kubenswrapper[4774]: I0127 00:10:49.693303 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" podUID="ba9690af-8b7d-4404-93d7-87b2de9bd203" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.57:6443/healthz\": dial tcp 10.217.0.57:6443: connect: connection refused" Jan 27 00:10:49 crc kubenswrapper[4774]: I0127 00:10:49.712890 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" podStartSLOduration=33.71285468 podStartE2EDuration="33.71285468s" podCreationTimestamp="2026-01-27 00:10:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:10:49.709099023 +0000 UTC m=+228.014875907" watchObservedRunningTime="2026-01-27 00:10:49.71285468 +0000 UTC m=+228.018631564" Jan 27 00:10:50 crc kubenswrapper[4774]: I0127 00:10:50.700061 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.572736 4774 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.574175 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.575233 4774 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.575572 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98" gracePeriod=15 Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.575602 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a" gracePeriod=15 Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.575752 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba" gracePeriod=15 Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.575831 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d" gracePeriod=15 Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.575896 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203" gracePeriod=15 Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578278 4774 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 00:10:58 crc kubenswrapper[4774]: E0127 00:10:58.578539 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578552 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 27 00:10:58 crc kubenswrapper[4774]: E0127 00:10:58.578565 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578573 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 27 00:10:58 crc kubenswrapper[4774]: E0127 00:10:58.578582 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578588 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 27 00:10:58 crc 
kubenswrapper[4774]: E0127 00:10:58.578597 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578603 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 00:10:58 crc kubenswrapper[4774]: E0127 00:10:58.578610 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578615 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 00:10:58 crc kubenswrapper[4774]: E0127 00:10:58.578624 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578630 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 27 00:10:58 crc kubenswrapper[4774]: E0127 00:10:58.578639 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578646 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578742 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578756 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578768 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578780 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578790 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578799 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 00:10:58 crc kubenswrapper[4774]: E0127 00:10:58.578911 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578921 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.579030 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Jan 27 00:10:58 crc kubenswrapper[4774]: E0127 00:10:58.640671 4774 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.27:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.659123 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.659294 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.659349 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.659442 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.660026 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.767899 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.767941 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.767974 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.768008 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.768036 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.768072 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.768089 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.768111 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.768171 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.768203 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.768224 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.768251 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.768280 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.770517 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.772766 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.773551 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a" exitCode=0 Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.773568 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba" exitCode=0 Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.773576 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d" exitCode=0 Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.773583 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203" exitCode=2 Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.773643 4774 scope.go:117] "RemoveContainer" containerID="4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.785912 4774 generic.go:334] "Generic (PLEG): container finished" podID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" containerID="05e7a9c90f1cfc8896ce47e0dacfdae023353b22d601861957e2c1280d6e97a3" exitCode=0 Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.785983 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c","Type":"ContainerDied","Data":"05e7a9c90f1cfc8896ce47e0dacfdae023353b22d601861957e2c1280d6e97a3"} Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.787534 4774 status_manager.go:851] "Failed to get status for pod" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.788368 4774 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" 
Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.869580 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.869648 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.869704 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.869782 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.869792 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.869803 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.943082 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: E0127 00:10:58.981134 4774 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.27:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188e6e021e22ae84 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 00:10:58.977762948 +0000 UTC m=+237.283539862,LastTimestamp:2026-01-27 00:10:58.977762948 +0000 UTC m=+237.283539862,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 00:10:59 crc kubenswrapper[4774]: E0127 00:10:59.443384 4774 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.27:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188e6e021e22ae84 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 00:10:58.977762948 +0000 UTC m=+237.283539862,LastTimestamp:2026-01-27 00:10:58.977762948 +0000 UTC m=+237.283539862,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 00:10:59 crc kubenswrapper[4774]: I0127 00:10:59.794232 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2"} Jan 27 00:10:59 crc kubenswrapper[4774]: I0127 00:10:59.794322 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"09beeb18b226da275eae17f55abb5dddd6b7c6f8844e81dd809b2041a4715d87"} Jan 27 00:10:59 crc kubenswrapper[4774]: E0127 00:10:59.795394 4774 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.27:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:59 crc kubenswrapper[4774]: 
I0127 00:10:59.796697 4774 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:10:59 crc kubenswrapper[4774]: I0127 00:10:59.797521 4774 status_manager.go:851] "Failed to get status for pod" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:10:59 crc kubenswrapper[4774]: I0127 00:10:59.800673 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.192038 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.193095 4774 status_manager.go:851] "Failed to get status for pod" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.194000 4774 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.293046 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kubelet-dir\") pod \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.293116 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kube-api-access\") pod \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.293159 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-var-lock\") pod \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.293163 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" (UID: "d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.293417 4774 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.293403 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-var-lock" (OuterVolumeSpecName: "var-lock") pod "d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" (UID: "d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.300704 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" (UID: "d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.394292 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.394621 4774 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-var-lock\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.829157 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c","Type":"ContainerDied","Data":"ed588f4d65707d0fe13e5dfdb082cdfddb42e10f2ae3fc8508a8ae1455271eaa"} Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.829226 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed588f4d65707d0fe13e5dfdb082cdfddb42e10f2ae3fc8508a8ae1455271eaa" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.829313 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.834580 4774 status_manager.go:851] "Failed to get status for pod" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.953661 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.954611 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.955157 4774 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.955521 4774 status_manager.go:851] "Failed to get status for pod" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.104534 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.104652 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.104672 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.104748 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.104766 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.104901 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.105175 4774 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.105205 4774 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.105217 4774 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.843686 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.845100 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98" exitCode=0 Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.845176 4774 scope.go:117] "RemoveContainer" containerID="03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.845456 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.865100 4774 status_manager.go:851] "Failed to get status for pod" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.865548 4774 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.875528 4774 scope.go:117] "RemoveContainer" containerID="c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.891114 4774 scope.go:117] "RemoveContainer" containerID="00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.913393 4774 scope.go:117] "RemoveContainer" containerID="0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.936262 4774 scope.go:117] "RemoveContainer" containerID="2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.967759 4774 scope.go:117] "RemoveContainer" containerID="b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.991489 4774 scope.go:117] "RemoveContainer" containerID="03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a" Jan 27 00:11:01 crc 
kubenswrapper[4774]: E0127 00:11:01.992357 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\": container with ID starting with 03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a not found: ID does not exist" containerID="03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.992408 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a"} err="failed to get container status \"03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\": rpc error: code = NotFound desc = could not find container \"03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\": container with ID starting with 03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a not found: ID does not exist" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.992444 4774 scope.go:117] "RemoveContainer" containerID="c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba" Jan 27 00:11:01 crc kubenswrapper[4774]: E0127 00:11:01.993013 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\": container with ID starting with c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba not found: ID does not exist" containerID="c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.993039 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba"} err="failed to get container status \"c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\": rpc error: code = NotFound desc = could not find container \"c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\": container with ID starting with c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba not found: ID does not exist" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.993055 4774 scope.go:117] "RemoveContainer" containerID="00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d" Jan 27 00:11:01 crc kubenswrapper[4774]: E0127 00:11:01.995132 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\": container with ID starting with 00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d not found: ID does not exist" containerID="00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.995181 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d"} err="failed to get container status \"00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\": rpc error: code = NotFound desc = could not find container \"00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\": container with ID starting with 00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d not found: ID does not exist" Jan 27 00:11:01 crc kubenswrapper[4774]: 
I0127 00:11:01.995236 4774 scope.go:117] "RemoveContainer" containerID="0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203" Jan 27 00:11:01 crc kubenswrapper[4774]: E0127 00:11:01.996003 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\": container with ID starting with 0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203 not found: ID does not exist" containerID="0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.996030 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203"} err="failed to get container status \"0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\": rpc error: code = NotFound desc = could not find container \"0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\": container with ID starting with 0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203 not found: ID does not exist" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.996048 4774 scope.go:117] "RemoveContainer" containerID="2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98" Jan 27 00:11:01 crc kubenswrapper[4774]: E0127 00:11:01.996389 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\": container with ID starting with 2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98 not found: ID does not exist" containerID="2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.996421 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98"} err="failed to get container status \"2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\": rpc error: code = NotFound desc = could not find container \"2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\": container with ID starting with 2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98 not found: ID does not exist" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.996463 4774 scope.go:117] "RemoveContainer" containerID="b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea" Jan 27 00:11:01 crc kubenswrapper[4774]: E0127 00:11:01.997016 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\": container with ID starting with b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea not found: ID does not exist" containerID="b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.997038 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea"} err="failed to get container status \"b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\": rpc error: code = NotFound desc = could not find container \"b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\": container 
with ID starting with b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea not found: ID does not exist" Jan 27 00:11:02 crc kubenswrapper[4774]: E0127 00:11:02.008612 4774 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:02 crc kubenswrapper[4774]: E0127 00:11:02.009218 4774 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:02 crc kubenswrapper[4774]: E0127 00:11:02.009666 4774 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:02 crc kubenswrapper[4774]: E0127 00:11:02.011032 4774 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:02 crc kubenswrapper[4774]: E0127 00:11:02.011705 4774 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:02 crc kubenswrapper[4774]: I0127 00:11:02.011788 4774 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 27 00:11:02 crc kubenswrapper[4774]: E0127 00:11:02.012290 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="200ms" Jan 27 00:11:02 crc kubenswrapper[4774]: E0127 00:11:02.214248 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="400ms" Jan 27 00:11:02 crc kubenswrapper[4774]: I0127 00:11:02.358493 4774 status_manager.go:851] "Failed to get status for pod" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:02 crc kubenswrapper[4774]: I0127 00:11:02.358811 4774 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:02 crc kubenswrapper[4774]: I0127 00:11:02.362976 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 27 00:11:02 crc kubenswrapper[4774]: E0127 00:11:02.615084 4774 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="800ms" Jan 27 00:11:03 crc kubenswrapper[4774]: E0127 00:11:03.416625 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="1.6s" Jan 27 00:11:05 crc kubenswrapper[4774]: E0127 00:11:05.020221 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="3.2s" Jan 27 00:11:08 crc kubenswrapper[4774]: E0127 00:11:08.221442 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="6.4s" Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.357623 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.359834 4774 status_manager.go:851] "Failed to get status for pod" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.379972 4774 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a62b292b-334d-470c-97c4-b86640ebc5bf" Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.380058 4774 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a62b292b-334d-470c-97c4-b86640ebc5bf" Jan 27 00:11:09 crc kubenswrapper[4774]: E0127 00:11:09.380636 4774 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.381306 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:09 crc kubenswrapper[4774]: W0127 00:11:09.428736 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-a47f3d24ba1d361569d2ac9734e7bb855c5a7738964ae3f806411cc89036bacd WatchSource:0}: Error finding container a47f3d24ba1d361569d2ac9734e7bb855c5a7738964ae3f806411cc89036bacd: Status 404 returned error can't find the container with id a47f3d24ba1d361569d2ac9734e7bb855c5a7738964ae3f806411cc89036bacd Jan 27 00:11:09 crc kubenswrapper[4774]: E0127 00:11:09.445776 4774 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.27:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188e6e021e22ae84 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 00:10:58.977762948 +0000 UTC m=+237.283539862,LastTimestamp:2026-01-27 00:10:58.977762948 +0000 UTC m=+237.283539862,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.914497 4774 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="bc27f6ca9a0b7ed3fe2850a1707feaeaa157fc862b1e65188fbbf91a7c9ea542" exitCode=0 Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.914600 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"bc27f6ca9a0b7ed3fe2850a1707feaeaa157fc862b1e65188fbbf91a7c9ea542"} Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.915732 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a47f3d24ba1d361569d2ac9734e7bb855c5a7738964ae3f806411cc89036bacd"} Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.916427 4774 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a62b292b-334d-470c-97c4-b86640ebc5bf" Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.916462 4774 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a62b292b-334d-470c-97c4-b86640ebc5bf" Jan 27 00:11:09 crc kubenswrapper[4774]: E0127 00:11:09.917167 4774 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.917178 4774 status_manager.go:851] 
"Failed to get status for pod" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:10 crc kubenswrapper[4774]: I0127 00:11:10.842691 4774 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 27 00:11:10 crc kubenswrapper[4774]: I0127 00:11:10.843174 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 27 00:11:10 crc kubenswrapper[4774]: I0127 00:11:10.934646 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"dfd2a33eeb4aa36e42ad8aff35ecce104aee6e7b9baca6b4d497997968743be3"} Jan 27 00:11:10 crc kubenswrapper[4774]: I0127 00:11:10.934723 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"77e714e2f02139e503acd2d790cb3eecf1694e951047d996b7024d01ac261e79"} Jan 27 00:11:10 crc kubenswrapper[4774]: I0127 00:11:10.934741 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"21b062e8c053cd29472647f0280ccb7d34b7646039e3bdc0ed57b8bad7fb9c0b"} Jan 27 00:11:10 crc kubenswrapper[4774]: I0127 00:11:10.939889 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 27 00:11:10 crc kubenswrapper[4774]: I0127 00:11:10.939954 4774 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78" exitCode=1 Jan 27 00:11:10 crc kubenswrapper[4774]: I0127 00:11:10.940004 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78"} Jan 27 00:11:10 crc kubenswrapper[4774]: I0127 00:11:10.940708 4774 scope.go:117] "RemoveContainer" containerID="3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78" Jan 27 00:11:11 crc kubenswrapper[4774]: I0127 00:11:11.952179 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 27 00:11:11 crc kubenswrapper[4774]: I0127 00:11:11.952497 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1df932f38c8736acf15a35fdc2454e2dffe503f54a5970b6785ed2254a84337c"} Jan 27 00:11:11 crc kubenswrapper[4774]: I0127 00:11:11.957016 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"41dcd7de625f79174e76b9152db0462b80dd3b5317720b3b2640da26297b4d8a"} Jan 27 00:11:11 crc kubenswrapper[4774]: I0127 00:11:11.957102 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"85ebfcfdcc132ec8a43257ac4dbc0c76d1f12a6df553ce3bc9cdcc3dd0ae7e31"} Jan 27 00:11:11 crc kubenswrapper[4774]: I0127 00:11:11.957207 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:11 crc kubenswrapper[4774]: I0127 00:11:11.957247 4774 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a62b292b-334d-470c-97c4-b86640ebc5bf" Jan 27 00:11:11 crc kubenswrapper[4774]: I0127 00:11:11.957272 4774 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a62b292b-334d-470c-97c4-b86640ebc5bf" Jan 27 00:11:14 crc kubenswrapper[4774]: I0127 00:11:14.382210 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:14 crc kubenswrapper[4774]: I0127 00:11:14.382282 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:14 crc kubenswrapper[4774]: I0127 00:11:14.390664 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:15 crc kubenswrapper[4774]: I0127 00:11:15.463353 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:11:16 crc kubenswrapper[4774]: I0127 00:11:16.968432 4774 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:16 crc kubenswrapper[4774]: I0127 00:11:16.994152 4774 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a62b292b-334d-470c-97c4-b86640ebc5bf" Jan 27 00:11:16 crc kubenswrapper[4774]: I0127 00:11:16.994178 4774 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a62b292b-334d-470c-97c4-b86640ebc5bf" Jan 27 00:11:16 crc kubenswrapper[4774]: I0127 00:11:16.999601 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:17 crc kubenswrapper[4774]: I0127 00:11:17.003476 4774 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="282a1d53-c7c3-4138-866e-4d597661333a" Jan 27 00:11:18 crc kubenswrapper[4774]: I0127 00:11:18.000704 4774 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a62b292b-334d-470c-97c4-b86640ebc5bf" Jan 27 00:11:18 crc kubenswrapper[4774]: I0127 00:11:18.000749 4774 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a62b292b-334d-470c-97c4-b86640ebc5bf" Jan 27 00:11:18 crc kubenswrapper[4774]: I0127 00:11:18.254821 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:11:18 crc kubenswrapper[4774]: I0127 00:11:18.255119 4774 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 27 00:11:18 crc kubenswrapper[4774]: I0127 00:11:18.255186 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 27 00:11:22 crc kubenswrapper[4774]: I0127 00:11:22.377961 4774 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="282a1d53-c7c3-4138-866e-4d597661333a" Jan 27 00:11:23 crc kubenswrapper[4774]: I0127 00:11:23.141787 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 27 00:11:23 crc kubenswrapper[4774]: I0127 00:11:23.266488 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 27 00:11:23 crc kubenswrapper[4774]: I0127 00:11:23.447491 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 27 00:11:23 crc kubenswrapper[4774]: I0127 00:11:23.635783 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 27 00:11:23 crc kubenswrapper[4774]: I0127 00:11:23.958777 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 27 00:11:24 crc kubenswrapper[4774]: I0127 00:11:24.295285 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 27 00:11:24 crc kubenswrapper[4774]: I0127 00:11:24.423484 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 27 00:11:24 crc kubenswrapper[4774]: I0127 00:11:24.599106 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 27 00:11:24 crc kubenswrapper[4774]: I0127 00:11:24.779545 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 27 00:11:24 crc kubenswrapper[4774]: I0127 00:11:24.928351 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 27 00:11:24 crc kubenswrapper[4774]: I0127 00:11:24.933160 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 27 00:11:24 crc kubenswrapper[4774]: I0127 00:11:24.950205 4774 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-console"/"oauth-serving-cert" Jan 27 00:11:25 crc kubenswrapper[4774]: I0127 00:11:25.019048 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 27 00:11:25 crc kubenswrapper[4774]: I0127 00:11:25.335846 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 27 00:11:25 crc kubenswrapper[4774]: I0127 00:11:25.545134 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 27 00:11:25 crc kubenswrapper[4774]: I0127 00:11:25.730457 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 27 00:11:25 crc kubenswrapper[4774]: I0127 00:11:25.971040 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 27 00:11:26 crc kubenswrapper[4774]: I0127 00:11:26.110904 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 27 00:11:26 crc kubenswrapper[4774]: I0127 00:11:26.111287 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 27 00:11:26 crc kubenswrapper[4774]: I0127 00:11:26.266171 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 27 00:11:26 crc kubenswrapper[4774]: I0127 00:11:26.301274 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 27 00:11:26 crc kubenswrapper[4774]: I0127 00:11:26.464311 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 27 00:11:26 crc kubenswrapper[4774]: I0127 00:11:26.861924 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 27 00:11:27 crc kubenswrapper[4774]: I0127 00:11:27.078367 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 27 00:11:27 crc kubenswrapper[4774]: I0127 00:11:27.442591 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 27 00:11:27 crc kubenswrapper[4774]: I0127 00:11:27.566144 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 27 00:11:27 crc kubenswrapper[4774]: I0127 00:11:27.580955 4774 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 27 00:11:27 crc kubenswrapper[4774]: I0127 00:11:27.659697 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 27 00:11:28 crc kubenswrapper[4774]: I0127 00:11:28.025502 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 27 00:11:28 crc kubenswrapper[4774]: I0127 00:11:28.194144 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 27 00:11:28 crc kubenswrapper[4774]: I0127 00:11:28.255173 4774 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: 
Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 27 00:11:28 crc kubenswrapper[4774]: I0127 00:11:28.255288 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 27 00:11:28 crc kubenswrapper[4774]: I0127 00:11:28.744894 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 27 00:11:29 crc kubenswrapper[4774]: I0127 00:11:29.452191 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 27 00:11:29 crc kubenswrapper[4774]: I0127 00:11:29.517235 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 27 00:11:29 crc kubenswrapper[4774]: I0127 00:11:29.859666 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 27 00:11:30 crc kubenswrapper[4774]: I0127 00:11:30.327845 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 27 00:11:30 crc kubenswrapper[4774]: I0127 00:11:30.537497 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 27 00:11:30 crc kubenswrapper[4774]: I0127 00:11:30.666683 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 27 00:11:30 crc kubenswrapper[4774]: I0127 00:11:30.753928 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 27 00:11:30 crc kubenswrapper[4774]: I0127 00:11:30.908576 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 27 00:11:30 crc kubenswrapper[4774]: I0127 00:11:30.912852 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 27 00:11:30 crc kubenswrapper[4774]: I0127 00:11:30.949238 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 27 00:11:31 crc kubenswrapper[4774]: I0127 00:11:31.150084 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 27 00:11:31 crc kubenswrapper[4774]: I0127 00:11:31.159212 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 27 00:11:31 crc kubenswrapper[4774]: I0127 00:11:31.532289 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 27 00:11:31 crc kubenswrapper[4774]: I0127 00:11:31.533472 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 27 00:11:31 crc kubenswrapper[4774]: I0127 00:11:31.593750 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 27 00:11:31 crc 
kubenswrapper[4774]: I0127 00:11:31.750145 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.136969 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.280061 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.340448 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.451147 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.484036 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.638225 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.658903 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.690835 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.825612 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.883714 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.973927 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.044673 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.338698 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.499799 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.596334 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.614264 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.664924 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.693120 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 
00:11:33.712749 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.753032 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.804251 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.938556 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.938733 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.955050 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.001083 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.085414 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.153947 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.200452 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.298578 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.442781 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.510795 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.624756 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.682034 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.687321 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.688907 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.751819 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.801434 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.815755 
4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.920516 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.933749 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.969023 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.986331 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.087080 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.119498 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.212131 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.214045 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.224222 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.289369 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.378468 4774 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.389022 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.389095 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.395466 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.396982 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.435474 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.436568 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.436545893999998 podStartE2EDuration="19.436545894s" podCreationTimestamp="2026-01-27 00:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:11:35.414832771 +0000 UTC m=+273.720609655" 
watchObservedRunningTime="2026-01-27 00:11:35.436545894 +0000 UTC m=+273.742322768" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.520150 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.608620 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.666795 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.669266 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.674912 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.791427 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.855178 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.871781 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.878668 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.894667 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.983833 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.990371 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.011801 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.083416 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.326670 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.375797 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.502659 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.545452 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.643319 4774 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.663690 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.674852 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.745849 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.877315 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.937990 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.024517 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.129170 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.129736 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.327571 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.363593 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.396491 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.403136 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.420633 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.515394 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.618903 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.688589 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.704385 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.740796 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 27 00:11:37 crc 
kubenswrapper[4774]: I0127 00:11:37.816772 4774 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.882318 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.926090 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.930012 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.985385 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.129329 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.141270 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.149644 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.165812 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.233087 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.248511 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.254767 4774 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.254900 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.254983 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.255801 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"1df932f38c8736acf15a35fdc2454e2dffe503f54a5970b6785ed2254a84337c"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.255980 4774 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://1df932f38c8736acf15a35fdc2454e2dffe503f54a5970b6785ed2254a84337c" gracePeriod=30 Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.263787 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.288692 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.308226 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.344481 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.407155 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.433609 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.575720 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.606743 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.632521 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.646872 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.664743 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.822659 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.827608 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.832778 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.928944 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.966944 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.979900 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.113422 4774 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.132710 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.139040 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.258201 4774 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.294597 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.404809 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.477626 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.478326 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.491223 4774 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.491441 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2" gracePeriod=5 Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.881964 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.913236 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.059501 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.147838 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.153818 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.191249 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.298889 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.328035 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.401095 4774 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"cni-copy-resources" Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.422051 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.562812 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.705316 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.734811 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.792232 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.872945 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.004919 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.042003 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.053052 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.075763 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.141238 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.220247 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.281399 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.317905 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.358140 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.413396 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.452469 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.487083 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.489976 4774 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.511908 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.687287 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.728769 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.777549 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.779845 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.866986 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.941349 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.981046 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.005790 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.018193 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.035206 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.040160 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.157587 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.171318 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.215266 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.263616 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.324532 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.339022 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 27 
00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.364391 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.378120 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.490526 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.549439 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.859897 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.891456 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.998003 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 27 00:11:43 crc kubenswrapper[4774]: I0127 00:11:43.033458 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 27 00:11:43 crc kubenswrapper[4774]: I0127 00:11:43.065654 4774 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 27 00:11:43 crc kubenswrapper[4774]: I0127 00:11:43.120212 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 27 00:11:43 crc kubenswrapper[4774]: I0127 00:11:43.195313 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 27 00:11:43 crc kubenswrapper[4774]: I0127 00:11:43.196086 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 27 00:11:43 crc kubenswrapper[4774]: I0127 00:11:43.480905 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 27 00:11:43 crc kubenswrapper[4774]: I0127 00:11:43.868303 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 27 00:11:43 crc kubenswrapper[4774]: I0127 00:11:43.977634 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 27 00:11:43 crc kubenswrapper[4774]: I0127 00:11:43.985130 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 27 00:11:44 crc kubenswrapper[4774]: I0127 00:11:44.039796 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 27 00:11:44 crc kubenswrapper[4774]: I0127 00:11:44.139572 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 27 00:11:44 crc kubenswrapper[4774]: I0127 00:11:44.256930 4774 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 27 00:11:44 crc kubenswrapper[4774]: I0127 00:11:44.444753 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 27 00:11:44 crc kubenswrapper[4774]: I0127 00:11:44.593529 4774 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 27 00:11:44 crc kubenswrapper[4774]: I0127 00:11:44.639707 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 27 00:11:44 crc kubenswrapper[4774]: I0127 00:11:44.795625 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 27 00:11:44 crc kubenswrapper[4774]: I0127 00:11:44.972476 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 27 00:11:44 crc kubenswrapper[4774]: I0127 00:11:44.973957 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.062676 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.062746 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.070016 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.138295 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.167205 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.167905 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.167949 4774 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2" exitCode=137 Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.168005 4774 scope.go:117] "RemoveContainer" containerID="274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.168043 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.198028 4774 scope.go:117] "RemoveContainer" containerID="274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2" Jan 27 00:11:45 crc kubenswrapper[4774]: E0127 00:11:45.198919 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2\": container with ID starting with 274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2 not found: ID does not exist" containerID="274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.198977 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2"} err="failed to get container status \"274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2\": rpc error: code = NotFound desc = could not find container \"274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2\": container with ID starting with 274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2 not found: ID does not exist" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.206352 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.216758 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.216804 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.216844 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.216892 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.216917 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.217002 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.217067 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.217093 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.217181 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.217566 4774 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.217613 4774 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.217626 4774 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.217638 4774 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.225548 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.250518 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.318675 4774 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.724344 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.920595 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 27 00:11:46 crc kubenswrapper[4774]: I0127 00:11:46.020589 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 27 00:11:46 crc kubenswrapper[4774]: I0127 00:11:46.241946 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 27 00:11:46 crc kubenswrapper[4774]: I0127 00:11:46.363563 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 27 00:11:48 crc kubenswrapper[4774]: I0127 00:11:48.032977 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 27 00:12:02 crc kubenswrapper[4774]: I0127 00:12:02.145743 4774 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 27 00:12:09 crc kubenswrapper[4774]: I0127 00:12:09.316393 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Jan 27 00:12:09 crc kubenswrapper[4774]: I0127 00:12:09.318690 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 27 00:12:09 crc kubenswrapper[4774]: I0127 00:12:09.318742 4774 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1df932f38c8736acf15a35fdc2454e2dffe503f54a5970b6785ed2254a84337c" exitCode=137 Jan 27 00:12:09 crc kubenswrapper[4774]: I0127 00:12:09.318769 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1df932f38c8736acf15a35fdc2454e2dffe503f54a5970b6785ed2254a84337c"} Jan 27 00:12:09 crc kubenswrapper[4774]: I0127 00:12:09.318800 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b5a2ea12dbf518bffc634af342a252a1cefda672b27fbb48f778587800cc58c7"} Jan 27 00:12:09 crc kubenswrapper[4774]: I0127 00:12:09.318818 4774 scope.go:117] "RemoveContainer" containerID="3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78" Jan 27 00:12:10 crc 
kubenswrapper[4774]: I0127 00:12:10.326189 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Jan 27 00:12:15 crc kubenswrapper[4774]: I0127 00:12:15.463442 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:12:18 crc kubenswrapper[4774]: I0127 00:12:18.254382 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:12:18 crc kubenswrapper[4774]: I0127 00:12:18.259924 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:12:25 crc kubenswrapper[4774]: I0127 00:12:25.467667 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.444673 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4"] Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.445933 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" podUID="8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff" containerName="route-controller-manager" containerID="cri-o://6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded" gracePeriod=30 Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.455873 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5b8h8"] Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.456442 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" podUID="9166b710-4bb0-4fc0-8e54-45907543c22f" containerName="controller-manager" containerID="cri-o://475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c" gracePeriod=30 Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.866569 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.871968 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.944997 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-proxy-ca-bundles\") pod \"9166b710-4bb0-4fc0-8e54-45907543c22f\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.945069 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-client-ca\") pod \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.945105 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9166b710-4bb0-4fc0-8e54-45907543c22f-serving-cert\") pod \"9166b710-4bb0-4fc0-8e54-45907543c22f\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.945125 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npcps\" (UniqueName: \"kubernetes.io/projected/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-kube-api-access-npcps\") pod \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.945146 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lxgx\" (UniqueName: \"kubernetes.io/projected/9166b710-4bb0-4fc0-8e54-45907543c22f-kube-api-access-4lxgx\") pod \"9166b710-4bb0-4fc0-8e54-45907543c22f\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.945171 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-serving-cert\") pod \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.945312 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-config\") pod \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.945329 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-client-ca\") pod \"9166b710-4bb0-4fc0-8e54-45907543c22f\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.945360 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-config\") pod \"9166b710-4bb0-4fc0-8e54-45907543c22f\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.945854 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9166b710-4bb0-4fc0-8e54-45907543c22f" 
(UID: "9166b710-4bb0-4fc0-8e54-45907543c22f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.945900 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-client-ca" (OuterVolumeSpecName: "client-ca") pod "8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff" (UID: "8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.946634 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-config" (OuterVolumeSpecName: "config") pod "8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff" (UID: "8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.946732 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-client-ca" (OuterVolumeSpecName: "client-ca") pod "9166b710-4bb0-4fc0-8e54-45907543c22f" (UID: "9166b710-4bb0-4fc0-8e54-45907543c22f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.947079 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-config" (OuterVolumeSpecName: "config") pod "9166b710-4bb0-4fc0-8e54-45907543c22f" (UID: "9166b710-4bb0-4fc0-8e54-45907543c22f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.950795 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9166b710-4bb0-4fc0-8e54-45907543c22f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9166b710-4bb0-4fc0-8e54-45907543c22f" (UID: "9166b710-4bb0-4fc0-8e54-45907543c22f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.950894 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff" (UID: "8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.951266 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9166b710-4bb0-4fc0-8e54-45907543c22f-kube-api-access-4lxgx" (OuterVolumeSpecName: "kube-api-access-4lxgx") pod "9166b710-4bb0-4fc0-8e54-45907543c22f" (UID: "9166b710-4bb0-4fc0-8e54-45907543c22f"). InnerVolumeSpecName "kube-api-access-4lxgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.951348 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-kube-api-access-npcps" (OuterVolumeSpecName: "kube-api-access-npcps") pod "8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff" (UID: "8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff"). 
InnerVolumeSpecName "kube-api-access-npcps". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.046176 4774 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.046213 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npcps\" (UniqueName: \"kubernetes.io/projected/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-kube-api-access-npcps\") on node \"crc\" DevicePath \"\"" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.046222 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9166b710-4bb0-4fc0-8e54-45907543c22f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.046230 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lxgx\" (UniqueName: \"kubernetes.io/projected/9166b710-4bb0-4fc0-8e54-45907543c22f-kube-api-access-4lxgx\") on node \"crc\" DevicePath \"\"" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.046239 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.046249 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.046257 4774 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.046265 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.046274 4774 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.447793 4774 generic.go:334] "Generic (PLEG): container finished" podID="9166b710-4bb0-4fc0-8e54-45907543c22f" containerID="475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c" exitCode=0 Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.447875 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" event={"ID":"9166b710-4bb0-4fc0-8e54-45907543c22f","Type":"ContainerDied","Data":"475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c"} Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.447921 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" event={"ID":"9166b710-4bb0-4fc0-8e54-45907543c22f","Type":"ContainerDied","Data":"20e7f45e0bc628b7de64cc93ac19fdef43b151e10efd40aab961cc3102eb973d"} Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.447958 4774 scope.go:117] "RemoveContainer" 
containerID="475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.448623 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.448970 4774 generic.go:334] "Generic (PLEG): container finished" podID="8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff" containerID="6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded" exitCode=0 Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.448993 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" event={"ID":"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff","Type":"ContainerDied","Data":"6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded"} Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.449036 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" event={"ID":"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff","Type":"ContainerDied","Data":"63fe7051984d9e95bbe56ab4890d8824695c38cf0105a3b96dcf3496f186c863"} Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.449151 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.465284 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4"] Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.466065 4774 scope.go:117] "RemoveContainer" containerID="475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c" Jan 27 00:12:28 crc kubenswrapper[4774]: E0127 00:12:28.466548 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c\": container with ID starting with 475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c not found: ID does not exist" containerID="475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.466595 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c"} err="failed to get container status \"475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c\": rpc error: code = NotFound desc = could not find container \"475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c\": container with ID starting with 475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c not found: ID does not exist" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.466623 4774 scope.go:117] "RemoveContainer" containerID="6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.469475 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4"] Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.479330 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5b8h8"] Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.480119 4774 
scope.go:117] "RemoveContainer" containerID="6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded" Jan 27 00:12:28 crc kubenswrapper[4774]: E0127 00:12:28.480457 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded\": container with ID starting with 6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded not found: ID does not exist" containerID="6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.480493 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded"} err="failed to get container status \"6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded\": rpc error: code = NotFound desc = could not find container \"6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded\": container with ID starting with 6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded not found: ID does not exist" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.482933 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5b8h8"] Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.698998 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-759c8d74f8-htq9z"] Jan 27 00:12:28 crc kubenswrapper[4774]: E0127 00:12:28.699620 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff" containerName="route-controller-manager" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.699662 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff" containerName="route-controller-manager" Jan 27 00:12:28 crc kubenswrapper[4774]: E0127 00:12:28.699690 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.699705 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 27 00:12:28 crc kubenswrapper[4774]: E0127 00:12:28.699727 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" containerName="installer" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.699741 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" containerName="installer" Jan 27 00:12:28 crc kubenswrapper[4774]: E0127 00:12:28.699782 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9166b710-4bb0-4fc0-8e54-45907543c22f" containerName="controller-manager" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.699796 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9166b710-4bb0-4fc0-8e54-45907543c22f" containerName="controller-manager" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.700014 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" containerName="installer" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.700037 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 27 00:12:28 crc 
kubenswrapper[4774]: I0127 00:12:28.700058 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff" containerName="route-controller-manager" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.700083 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="9166b710-4bb0-4fc0-8e54-45907543c22f" containerName="controller-manager" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.701009 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.703715 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn"] Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.703847 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.703898 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.703937 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.704790 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.705352 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.705899 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.706328 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.713674 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.715989 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.716153 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.716198 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.716283 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.716361 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.722687 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-759c8d74f8-htq9z"] Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.759112 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5202c840-a2f1-48c0-884f-9c74c38daa3b-serving-cert\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.759188 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5202c840-a2f1-48c0-884f-9c74c38daa3b-config\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.759234 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrgf9\" (UniqueName: \"kubernetes.io/projected/5202c840-a2f1-48c0-884f-9c74c38daa3b-kube-api-access-nrgf9\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.759407 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5202c840-a2f1-48c0-884f-9c74c38daa3b-proxy-ca-bundles\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.759459 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5202c840-a2f1-48c0-884f-9c74c38daa3b-client-ca\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.762746 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.809380 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn"] Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.861419 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5202c840-a2f1-48c0-884f-9c74c38daa3b-serving-cert\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.861524 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-config\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.861559 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5202c840-a2f1-48c0-884f-9c74c38daa3b-config\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.861585 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrgf9\" (UniqueName: \"kubernetes.io/projected/5202c840-a2f1-48c0-884f-9c74c38daa3b-kube-api-access-nrgf9\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.861622 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-client-ca\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.861655 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjrc7\" (UniqueName: \"kubernetes.io/projected/4240439b-6b80-4a41-960f-8629057b254a-kube-api-access-qjrc7\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.861724 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5202c840-a2f1-48c0-884f-9c74c38daa3b-proxy-ca-bundles\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.861763 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5202c840-a2f1-48c0-884f-9c74c38daa3b-client-ca\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.861824 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4240439b-6b80-4a41-960f-8629057b254a-serving-cert\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.864137 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5202c840-a2f1-48c0-884f-9c74c38daa3b-config\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.864157 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5202c840-a2f1-48c0-884f-9c74c38daa3b-client-ca\") pod 
\"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.864254 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5202c840-a2f1-48c0-884f-9c74c38daa3b-proxy-ca-bundles\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.884929 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5202c840-a2f1-48c0-884f-9c74c38daa3b-serving-cert\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.886674 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrgf9\" (UniqueName: \"kubernetes.io/projected/5202c840-a2f1-48c0-884f-9c74c38daa3b-kube-api-access-nrgf9\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.963388 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-config\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.963443 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-client-ca\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.963463 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjrc7\" (UniqueName: \"kubernetes.io/projected/4240439b-6b80-4a41-960f-8629057b254a-kube-api-access-qjrc7\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.963520 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4240439b-6b80-4a41-960f-8629057b254a-serving-cert\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.964351 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-client-ca\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " 
pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.964564 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-config\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.967014 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4240439b-6b80-4a41-960f-8629057b254a-serving-cert\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.979371 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjrc7\" (UniqueName: \"kubernetes.io/projected/4240439b-6b80-4a41-960f-8629057b254a-kube-api-access-qjrc7\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:29 crc kubenswrapper[4774]: I0127 00:12:29.063352 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:29 crc kubenswrapper[4774]: I0127 00:12:29.084449 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:29 crc kubenswrapper[4774]: I0127 00:12:29.380410 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn"] Jan 27 00:12:29 crc kubenswrapper[4774]: I0127 00:12:29.455071 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" event={"ID":"4240439b-6b80-4a41-960f-8629057b254a","Type":"ContainerStarted","Data":"a889b3e4423ddcb596fb765d9b93f601d997c73d3159d03696ff4a486f6a59c1"} Jan 27 00:12:29 crc kubenswrapper[4774]: I0127 00:12:29.516934 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-759c8d74f8-htq9z"] Jan 27 00:12:29 crc kubenswrapper[4774]: W0127 00:12:29.524211 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5202c840_a2f1_48c0_884f_9c74c38daa3b.slice/crio-71af2469ad116517b177fec4ef5e13b35f3c8a801cb947de1df4a38e9d5e6b61 WatchSource:0}: Error finding container 71af2469ad116517b177fec4ef5e13b35f3c8a801cb947de1df4a38e9d5e6b61: Status 404 returned error can't find the container with id 71af2469ad116517b177fec4ef5e13b35f3c8a801cb947de1df4a38e9d5e6b61 Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.363413 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff" path="/var/lib/kubelet/pods/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff/volumes" Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.364299 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9166b710-4bb0-4fc0-8e54-45907543c22f" 
path="/var/lib/kubelet/pods/9166b710-4bb0-4fc0-8e54-45907543c22f/volumes" Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.462606 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" event={"ID":"5202c840-a2f1-48c0-884f-9c74c38daa3b","Type":"ContainerStarted","Data":"f39e82d9fd3aa879ae91733f29d3a052848b6be8772b3db358d6519035089c4d"} Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.462650 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" event={"ID":"5202c840-a2f1-48c0-884f-9c74c38daa3b","Type":"ContainerStarted","Data":"71af2469ad116517b177fec4ef5e13b35f3c8a801cb947de1df4a38e9d5e6b61"} Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.462844 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.465668 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" event={"ID":"4240439b-6b80-4a41-960f-8629057b254a","Type":"ContainerStarted","Data":"a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b"} Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.465925 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.470787 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.472584 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.491278 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" podStartSLOduration=3.491261171 podStartE2EDuration="3.491261171s" podCreationTimestamp="2026-01-27 00:12:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:12:30.485657572 +0000 UTC m=+328.791434456" watchObservedRunningTime="2026-01-27 00:12:30.491261171 +0000 UTC m=+328.797038055" Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.509192 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" podStartSLOduration=3.50917262 podStartE2EDuration="3.50917262s" podCreationTimestamp="2026-01-27 00:12:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:12:30.504786042 +0000 UTC m=+328.810562936" watchObservedRunningTime="2026-01-27 00:12:30.50917262 +0000 UTC m=+328.814949494" Jan 27 00:12:36 crc kubenswrapper[4774]: I0127 00:12:36.675613 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:12:36 crc kubenswrapper[4774]: I0127 
00:12:36.677349 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.136592 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hdn9m"] Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.137671 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.147998 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hdn9m"] Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.275552 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.275720 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7737e34-de67-4910-b082-a754145009df-registry-tls\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.276422 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d7737e34-de67-4910-b082-a754145009df-registry-certificates\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.276637 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d7737e34-de67-4910-b082-a754145009df-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.276711 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d7737e34-de67-4910-b082-a754145009df-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.276760 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7737e34-de67-4910-b082-a754145009df-bound-sa-token\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 
00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.276805 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7737e34-de67-4910-b082-a754145009df-trusted-ca\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.276845 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x49b\" (UniqueName: \"kubernetes.io/projected/d7737e34-de67-4910-b082-a754145009df-kube-api-access-4x49b\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.312836 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.379521 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d7737e34-de67-4910-b082-a754145009df-registry-certificates\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.379779 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d7737e34-de67-4910-b082-a754145009df-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.379803 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d7737e34-de67-4910-b082-a754145009df-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.379824 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7737e34-de67-4910-b082-a754145009df-bound-sa-token\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.379840 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7737e34-de67-4910-b082-a754145009df-trusted-ca\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.379870 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x49b\" 
(UniqueName: \"kubernetes.io/projected/d7737e34-de67-4910-b082-a754145009df-kube-api-access-4x49b\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.379919 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7737e34-de67-4910-b082-a754145009df-registry-tls\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.380303 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d7737e34-de67-4910-b082-a754145009df-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.381059 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d7737e34-de67-4910-b082-a754145009df-registry-certificates\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.381176 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7737e34-de67-4910-b082-a754145009df-trusted-ca\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.394680 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7737e34-de67-4910-b082-a754145009df-registry-tls\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.394678 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d7737e34-de67-4910-b082-a754145009df-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.396714 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7737e34-de67-4910-b082-a754145009df-bound-sa-token\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.398223 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x49b\" (UniqueName: \"kubernetes.io/projected/d7737e34-de67-4910-b082-a754145009df-kube-api-access-4x49b\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc 
kubenswrapper[4774]: I0127 00:12:52.454126 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.853475 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hdn9m"] Jan 27 00:12:52 crc kubenswrapper[4774]: W0127 00:12:52.862704 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7737e34_de67_4910_b082_a754145009df.slice/crio-1aa43303ac8854b242d38ce0e0abe34f9332e659e4be4bef5673b9f40595aa2c WatchSource:0}: Error finding container 1aa43303ac8854b242d38ce0e0abe34f9332e659e4be4bef5673b9f40595aa2c: Status 404 returned error can't find the container with id 1aa43303ac8854b242d38ce0e0abe34f9332e659e4be4bef5673b9f40595aa2c Jan 27 00:12:53 crc kubenswrapper[4774]: I0127 00:12:53.588567 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" event={"ID":"d7737e34-de67-4910-b082-a754145009df","Type":"ContainerStarted","Data":"a8252c2aab93f4db7b8b32b5a073eda6d90ee2c2f1951b46fe52073babd45079"} Jan 27 00:12:53 crc kubenswrapper[4774]: I0127 00:12:53.588629 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" event={"ID":"d7737e34-de67-4910-b082-a754145009df","Type":"ContainerStarted","Data":"1aa43303ac8854b242d38ce0e0abe34f9332e659e4be4bef5673b9f40595aa2c"} Jan 27 00:12:53 crc kubenswrapper[4774]: I0127 00:12:53.588725 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:53 crc kubenswrapper[4774]: I0127 00:12:53.604767 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" podStartSLOduration=1.60474297 podStartE2EDuration="1.60474297s" podCreationTimestamp="2026-01-27 00:12:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:12:53.603154688 +0000 UTC m=+351.908931592" watchObservedRunningTime="2026-01-27 00:12:53.60474297 +0000 UTC m=+351.910519864" Jan 27 00:13:06 crc kubenswrapper[4774]: I0127 00:13:06.675544 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:13:06 crc kubenswrapper[4774]: I0127 00:13:06.676606 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:13:07 crc kubenswrapper[4774]: I0127 00:13:07.780497 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn"] Jan 27 00:13:07 crc kubenswrapper[4774]: I0127 00:13:07.781328 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" 
podUID="4240439b-6b80-4a41-960f-8629057b254a" containerName="route-controller-manager" containerID="cri-o://a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b" gracePeriod=30 Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.135578 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.212047 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-client-ca\") pod \"4240439b-6b80-4a41-960f-8629057b254a\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.212630 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4240439b-6b80-4a41-960f-8629057b254a-serving-cert\") pod \"4240439b-6b80-4a41-960f-8629057b254a\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.213851 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-config\") pod \"4240439b-6b80-4a41-960f-8629057b254a\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.214336 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-client-ca" (OuterVolumeSpecName: "client-ca") pod "4240439b-6b80-4a41-960f-8629057b254a" (UID: "4240439b-6b80-4a41-960f-8629057b254a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.214469 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-config" (OuterVolumeSpecName: "config") pod "4240439b-6b80-4a41-960f-8629057b254a" (UID: "4240439b-6b80-4a41-960f-8629057b254a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.214487 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjrc7\" (UniqueName: \"kubernetes.io/projected/4240439b-6b80-4a41-960f-8629057b254a-kube-api-access-qjrc7\") pod \"4240439b-6b80-4a41-960f-8629057b254a\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.215116 4774 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.215575 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.221045 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4240439b-6b80-4a41-960f-8629057b254a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4240439b-6b80-4a41-960f-8629057b254a" (UID: "4240439b-6b80-4a41-960f-8629057b254a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.222665 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4240439b-6b80-4a41-960f-8629057b254a-kube-api-access-qjrc7" (OuterVolumeSpecName: "kube-api-access-qjrc7") pod "4240439b-6b80-4a41-960f-8629057b254a" (UID: "4240439b-6b80-4a41-960f-8629057b254a"). InnerVolumeSpecName "kube-api-access-qjrc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.316669 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4240439b-6b80-4a41-960f-8629057b254a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.316718 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjrc7\" (UniqueName: \"kubernetes.io/projected/4240439b-6b80-4a41-960f-8629057b254a-kube-api-access-qjrc7\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.672235 4774 generic.go:334] "Generic (PLEG): container finished" podID="4240439b-6b80-4a41-960f-8629057b254a" containerID="a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b" exitCode=0 Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.672304 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" event={"ID":"4240439b-6b80-4a41-960f-8629057b254a","Type":"ContainerDied","Data":"a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b"} Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.672346 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.672368 4774 scope.go:117] "RemoveContainer" containerID="a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.672351 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" event={"ID":"4240439b-6b80-4a41-960f-8629057b254a","Type":"ContainerDied","Data":"a889b3e4423ddcb596fb765d9b93f601d997c73d3159d03696ff4a486f6a59c1"} Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.693383 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn"] Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.693684 4774 scope.go:117] "RemoveContainer" containerID="a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b" Jan 27 00:13:08 crc kubenswrapper[4774]: E0127 00:13:08.694378 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b\": container with ID starting with a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b not found: ID does not exist" containerID="a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.694480 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b"} err="failed to get container status \"a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b\": rpc error: code = NotFound desc = could not find container \"a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b\": container with ID starting with a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b not found: ID does not exist" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.696781 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn"] Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.933440 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb"] Jan 27 00:13:08 crc kubenswrapper[4774]: E0127 00:13:08.933940 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4240439b-6b80-4a41-960f-8629057b254a" containerName="route-controller-manager" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.933966 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4240439b-6b80-4a41-960f-8629057b254a" containerName="route-controller-manager" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.934194 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="4240439b-6b80-4a41-960f-8629057b254a" containerName="route-controller-manager" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.934970 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.937188 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.939670 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.940009 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.940250 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.940376 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.940463 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.943228 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb"] Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.026914 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5957fa6-b497-4d76-9577-6b17bc66a6f4-client-ca\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.026994 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbw4k\" (UniqueName: \"kubernetes.io/projected/b5957fa6-b497-4d76-9577-6b17bc66a6f4-kube-api-access-fbw4k\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.027035 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5957fa6-b497-4d76-9577-6b17bc66a6f4-serving-cert\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.027065 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5957fa6-b497-4d76-9577-6b17bc66a6f4-config\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.128832 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5957fa6-b497-4d76-9577-6b17bc66a6f4-client-ca\") pod 
\"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.129177 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbw4k\" (UniqueName: \"kubernetes.io/projected/b5957fa6-b497-4d76-9577-6b17bc66a6f4-kube-api-access-fbw4k\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.129336 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5957fa6-b497-4d76-9577-6b17bc66a6f4-serving-cert\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.129433 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5957fa6-b497-4d76-9577-6b17bc66a6f4-config\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.131144 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5957fa6-b497-4d76-9577-6b17bc66a6f4-config\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.131147 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5957fa6-b497-4d76-9577-6b17bc66a6f4-client-ca\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.139573 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5957fa6-b497-4d76-9577-6b17bc66a6f4-serving-cert\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.147229 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbw4k\" (UniqueName: \"kubernetes.io/projected/b5957fa6-b497-4d76-9577-6b17bc66a6f4-kube-api-access-fbw4k\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.257215 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.669386 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb"] Jan 27 00:13:09 crc kubenswrapper[4774]: W0127 00:13:09.689606 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5957fa6_b497_4d76_9577_6b17bc66a6f4.slice/crio-12c59ded9a3e4c29f9f1057dbe4235ceaf95756eae8df9a6010b4a67a0b75cb5 WatchSource:0}: Error finding container 12c59ded9a3e4c29f9f1057dbe4235ceaf95756eae8df9a6010b4a67a0b75cb5: Status 404 returned error can't find the container with id 12c59ded9a3e4c29f9f1057dbe4235ceaf95756eae8df9a6010b4a67a0b75cb5 Jan 27 00:13:10 crc kubenswrapper[4774]: I0127 00:13:10.364254 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4240439b-6b80-4a41-960f-8629057b254a" path="/var/lib/kubelet/pods/4240439b-6b80-4a41-960f-8629057b254a/volumes" Jan 27 00:13:10 crc kubenswrapper[4774]: I0127 00:13:10.710402 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" event={"ID":"b5957fa6-b497-4d76-9577-6b17bc66a6f4","Type":"ContainerStarted","Data":"c9125167004f2e4cacd5db93979dd557dab8dd6060e6b71674fa8b5880547acb"} Jan 27 00:13:10 crc kubenswrapper[4774]: I0127 00:13:10.710468 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" event={"ID":"b5957fa6-b497-4d76-9577-6b17bc66a6f4","Type":"ContainerStarted","Data":"12c59ded9a3e4c29f9f1057dbe4235ceaf95756eae8df9a6010b4a67a0b75cb5"} Jan 27 00:13:10 crc kubenswrapper[4774]: I0127 00:13:10.710994 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:10 crc kubenswrapper[4774]: I0127 00:13:10.717502 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:10 crc kubenswrapper[4774]: I0127 00:13:10.759840 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" podStartSLOduration=3.759815139 podStartE2EDuration="3.759815139s" podCreationTimestamp="2026-01-27 00:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:13:10.738925321 +0000 UTC m=+369.044702215" watchObservedRunningTime="2026-01-27 00:13:10.759815139 +0000 UTC m=+369.065592043" Jan 27 00:13:12 crc kubenswrapper[4774]: I0127 00:13:12.464030 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:13:12 crc kubenswrapper[4774]: I0127 00:13:12.569680 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zgbtz"] Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.899269 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pj24q"] Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.900832 4774 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-pj24q" podUID="af002227-deb6-4a24-8ce5-051f93bc178b" containerName="registry-server" containerID="cri-o://4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4" gracePeriod=30 Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.928526 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lnlbr"] Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.929023 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lnlbr" podUID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerName="registry-server" containerID="cri-o://5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e" gracePeriod=30 Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.941843 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxtrq"] Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.942101 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" podUID="27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" containerName="marketplace-operator" containerID="cri-o://b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5" gracePeriod=30 Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.951607 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ds9fx"] Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.951887 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ds9fx" podUID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerName="registry-server" containerID="cri-o://d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f" gracePeriod=30 Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.966143 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-97k6k"] Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.966550 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-97k6k" podUID="386f196b-c4bc-4fea-924b-c0487a352310" containerName="registry-server" containerID="cri-o://e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b" gracePeriod=30 Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.973696 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bcz6w"] Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.974570 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.977515 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bcz6w"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.030946 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wv2t\" (UniqueName: \"kubernetes.io/projected/fb20bbdd-0a13-4d07-8c4a-8c2285de3173-kube-api-access-9wv2t\") pod \"marketplace-operator-79b997595-bcz6w\" (UID: \"fb20bbdd-0a13-4d07-8c4a-8c2285de3173\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.031135 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb20bbdd-0a13-4d07-8c4a-8c2285de3173-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bcz6w\" (UID: \"fb20bbdd-0a13-4d07-8c4a-8c2285de3173\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.031278 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fb20bbdd-0a13-4d07-8c4a-8c2285de3173-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bcz6w\" (UID: \"fb20bbdd-0a13-4d07-8c4a-8c2285de3173\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.132115 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb20bbdd-0a13-4d07-8c4a-8c2285de3173-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bcz6w\" (UID: \"fb20bbdd-0a13-4d07-8c4a-8c2285de3173\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.132435 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fb20bbdd-0a13-4d07-8c4a-8c2285de3173-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bcz6w\" (UID: \"fb20bbdd-0a13-4d07-8c4a-8c2285de3173\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.132618 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wv2t\" (UniqueName: \"kubernetes.io/projected/fb20bbdd-0a13-4d07-8c4a-8c2285de3173-kube-api-access-9wv2t\") pod \"marketplace-operator-79b997595-bcz6w\" (UID: \"fb20bbdd-0a13-4d07-8c4a-8c2285de3173\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.134041 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb20bbdd-0a13-4d07-8c4a-8c2285de3173-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bcz6w\" (UID: \"fb20bbdd-0a13-4d07-8c4a-8c2285de3173\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.142085 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/fb20bbdd-0a13-4d07-8c4a-8c2285de3173-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bcz6w\" (UID: \"fb20bbdd-0a13-4d07-8c4a-8c2285de3173\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.154996 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wv2t\" (UniqueName: \"kubernetes.io/projected/fb20bbdd-0a13-4d07-8c4a-8c2285de3173-kube-api-access-9wv2t\") pod \"marketplace-operator-79b997595-bcz6w\" (UID: \"fb20bbdd-0a13-4d07-8c4a-8c2285de3173\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.347088 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.365045 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.437461 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-catalog-content\") pod \"af002227-deb6-4a24-8ce5-051f93bc178b\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.442561 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlfv9\" (UniqueName: \"kubernetes.io/projected/af002227-deb6-4a24-8ce5-051f93bc178b-kube-api-access-qlfv9\") pod \"af002227-deb6-4a24-8ce5-051f93bc178b\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.442718 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-utilities\") pod \"af002227-deb6-4a24-8ce5-051f93bc178b\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.450763 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-utilities" (OuterVolumeSpecName: "utilities") pod "af002227-deb6-4a24-8ce5-051f93bc178b" (UID: "af002227-deb6-4a24-8ce5-051f93bc178b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.454009 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af002227-deb6-4a24-8ce5-051f93bc178b-kube-api-access-qlfv9" (OuterVolumeSpecName: "kube-api-access-qlfv9") pod "af002227-deb6-4a24-8ce5-051f93bc178b" (UID: "af002227-deb6-4a24-8ce5-051f93bc178b"). InnerVolumeSpecName "kube-api-access-qlfv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.455944 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.465637 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.494774 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.500385 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.545917 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-utilities\") pod \"6cad24dd-acc4-40e5-8380-e0d74be79921\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.545975 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh2lr\" (UniqueName: \"kubernetes.io/projected/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-kube-api-access-xh2lr\") pod \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546020 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-catalog-content\") pod \"386f196b-c4bc-4fea-924b-c0487a352310\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546045 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk44x\" (UniqueName: \"kubernetes.io/projected/386f196b-c4bc-4fea-924b-c0487a352310-kube-api-access-vk44x\") pod \"386f196b-c4bc-4fea-924b-c0487a352310\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546130 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bbh6\" (UniqueName: \"kubernetes.io/projected/6cad24dd-acc4-40e5-8380-e0d74be79921-kube-api-access-7bbh6\") pod \"6cad24dd-acc4-40e5-8380-e0d74be79921\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546171 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-catalog-content\") pod \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546188 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcf42\" (UniqueName: \"kubernetes.io/projected/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-kube-api-access-lcf42\") pod \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546205 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-utilities\") pod \"386f196b-c4bc-4fea-924b-c0487a352310\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546237 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-catalog-content\") pod \"6cad24dd-acc4-40e5-8380-e0d74be79921\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546262 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-operator-metrics\") pod \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546290 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-trusted-ca\") pod \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546319 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-utilities\") pod \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546527 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546548 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlfv9\" (UniqueName: \"kubernetes.io/projected/af002227-deb6-4a24-8ce5-051f93bc178b-kube-api-access-qlfv9\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.547212 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-utilities" (OuterVolumeSpecName: "utilities") pod "687ff3b2-f773-482b-9bf7-1b6135b4d6ac" (UID: "687ff3b2-f773-482b-9bf7-1b6135b4d6ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.548140 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-utilities" (OuterVolumeSpecName: "utilities") pod "6cad24dd-acc4-40e5-8380-e0d74be79921" (UID: "6cad24dd-acc4-40e5-8380-e0d74be79921"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.551332 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-kube-api-access-xh2lr" (OuterVolumeSpecName: "kube-api-access-xh2lr") pod "27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" (UID: "27139ba0-dc60-4e5c-aff2-dc33b8f1ff72"). InnerVolumeSpecName "kube-api-access-xh2lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.552124 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-utilities" (OuterVolumeSpecName: "utilities") pod "386f196b-c4bc-4fea-924b-c0487a352310" (UID: "386f196b-c4bc-4fea-924b-c0487a352310"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.554323 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cad24dd-acc4-40e5-8380-e0d74be79921-kube-api-access-7bbh6" (OuterVolumeSpecName: "kube-api-access-7bbh6") pod "6cad24dd-acc4-40e5-8380-e0d74be79921" (UID: "6cad24dd-acc4-40e5-8380-e0d74be79921"). InnerVolumeSpecName "kube-api-access-7bbh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.560969 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" (UID: "27139ba0-dc60-4e5c-aff2-dc33b8f1ff72"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.573295 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/386f196b-c4bc-4fea-924b-c0487a352310-kube-api-access-vk44x" (OuterVolumeSpecName: "kube-api-access-vk44x") pod "386f196b-c4bc-4fea-924b-c0487a352310" (UID: "386f196b-c4bc-4fea-924b-c0487a352310"). InnerVolumeSpecName "kube-api-access-vk44x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.574401 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-kube-api-access-lcf42" (OuterVolumeSpecName: "kube-api-access-lcf42") pod "687ff3b2-f773-482b-9bf7-1b6135b4d6ac" (UID: "687ff3b2-f773-482b-9bf7-1b6135b4d6ac"). InnerVolumeSpecName "kube-api-access-lcf42". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.575548 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" (UID: "27139ba0-dc60-4e5c-aff2-dc33b8f1ff72"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.589664 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "687ff3b2-f773-482b-9bf7-1b6135b4d6ac" (UID: "687ff3b2-f773-482b-9bf7-1b6135b4d6ac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.590465 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af002227-deb6-4a24-8ce5-051f93bc178b" (UID: "af002227-deb6-4a24-8ce5-051f93bc178b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.619328 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6cad24dd-acc4-40e5-8380-e0d74be79921" (UID: "6cad24dd-acc4-40e5-8380-e0d74be79921"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.626075 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bcz6w"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648261 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bbh6\" (UniqueName: \"kubernetes.io/projected/6cad24dd-acc4-40e5-8380-e0d74be79921-kube-api-access-7bbh6\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648846 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcf42\" (UniqueName: \"kubernetes.io/projected/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-kube-api-access-lcf42\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648873 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648886 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648898 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648908 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648917 4774 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648928 4774 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648941 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648950 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648960 4774 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-xh2lr\" (UniqueName: \"kubernetes.io/projected/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-kube-api-access-xh2lr\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648969 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk44x\" (UniqueName: \"kubernetes.io/projected/386f196b-c4bc-4fea-924b-c0487a352310-kube-api-access-vk44x\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.682582 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "386f196b-c4bc-4fea-924b-c0487a352310" (UID: "386f196b-c4bc-4fea-924b-c0487a352310"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.750419 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.788034 4774 generic.go:334] "Generic (PLEG): container finished" podID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerID="5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e" exitCode=0 Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.788134 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnlbr" event={"ID":"6cad24dd-acc4-40e5-8380-e0d74be79921","Type":"ContainerDied","Data":"5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.788183 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnlbr" event={"ID":"6cad24dd-acc4-40e5-8380-e0d74be79921","Type":"ContainerDied","Data":"9329c497510d489baaa3885ca712f040e06f04f21f51c80edb354ab47b28b6f1"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.788177 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.788226 4774 scope.go:117] "RemoveContainer" containerID="5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.796105 4774 generic.go:334] "Generic (PLEG): container finished" podID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerID="d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f" exitCode=0 Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.796205 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds9fx" event={"ID":"687ff3b2-f773-482b-9bf7-1b6135b4d6ac","Type":"ContainerDied","Data":"d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.796254 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds9fx" event={"ID":"687ff3b2-f773-482b-9bf7-1b6135b4d6ac","Type":"ContainerDied","Data":"7c895fbfa4a8556de4f63281aed06489f2f81ec44684ed968eb3615baa9f463f"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.796344 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.803918 4774 generic.go:334] "Generic (PLEG): container finished" podID="af002227-deb6-4a24-8ce5-051f93bc178b" containerID="4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4" exitCode=0 Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.803990 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pj24q" event={"ID":"af002227-deb6-4a24-8ce5-051f93bc178b","Type":"ContainerDied","Data":"4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.804022 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pj24q" event={"ID":"af002227-deb6-4a24-8ce5-051f93bc178b","Type":"ContainerDied","Data":"f9e3168af24b8a48492cc6713c61bfd5396824e8a18842b10c09f8a37f4eed28"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.804113 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.811399 4774 scope.go:117] "RemoveContainer" containerID="20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.814180 4774 generic.go:334] "Generic (PLEG): container finished" podID="386f196b-c4bc-4fea-924b-c0487a352310" containerID="e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b" exitCode=0 Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.814286 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97k6k" event={"ID":"386f196b-c4bc-4fea-924b-c0487a352310","Type":"ContainerDied","Data":"e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.814336 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97k6k" event={"ID":"386f196b-c4bc-4fea-924b-c0487a352310","Type":"ContainerDied","Data":"8a4f7dc871614277fda5710d46e4863b99158a3773f029cd73a7cc991db5ffae"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.814452 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.819085 4774 generic.go:334] "Generic (PLEG): container finished" podID="27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" containerID="b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5" exitCode=0 Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.819183 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.819295 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" event={"ID":"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72","Type":"ContainerDied","Data":"b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.819362 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" event={"ID":"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72","Type":"ContainerDied","Data":"68dbadf6774272234ad63109a60f8646d78e5ffd2b9aca24a0498e0100dbeb4f"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.823376 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" event={"ID":"fb20bbdd-0a13-4d07-8c4a-8c2285de3173","Type":"ContainerStarted","Data":"ac3374a449cbad41720e0476a29435259f384b34285215bffe05a51f16f1af9b"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.825128 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.828279 4774 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bcz6w container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.63:8080/healthz\": dial tcp 10.217.0.63:8080: connect: connection refused" start-of-body= Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.828350 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" podUID="fb20bbdd-0a13-4d07-8c4a-8c2285de3173" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.63:8080/healthz\": dial tcp 10.217.0.63:8080: connect: connection refused" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.836628 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lnlbr"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.848167 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lnlbr"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.860501 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ds9fx"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.862226 4774 scope.go:117] "RemoveContainer" containerID="063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.864507 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ds9fx"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.877682 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pj24q"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.882077 4774 scope.go:117] "RemoveContainer" containerID="5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e" Jan 27 00:13:20 crc kubenswrapper[4774]: E0127 00:13:20.883260 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e\": container with ID starting with 
5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e not found: ID does not exist" containerID="5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.883289 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e"} err="failed to get container status \"5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e\": rpc error: code = NotFound desc = could not find container \"5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e\": container with ID starting with 5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e not found: ID does not exist" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.883315 4774 scope.go:117] "RemoveContainer" containerID="20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f" Jan 27 00:13:20 crc kubenswrapper[4774]: E0127 00:13:20.883698 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f\": container with ID starting with 20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f not found: ID does not exist" containerID="20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.883722 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f"} err="failed to get container status \"20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f\": rpc error: code = NotFound desc = could not find container \"20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f\": container with ID starting with 20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f not found: ID does not exist" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.883736 4774 scope.go:117] "RemoveContainer" containerID="063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89" Jan 27 00:13:20 crc kubenswrapper[4774]: E0127 00:13:20.884198 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89\": container with ID starting with 063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89 not found: ID does not exist" containerID="063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.884215 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89"} err="failed to get container status \"063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89\": rpc error: code = NotFound desc = could not find container \"063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89\": container with ID starting with 063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89 not found: ID does not exist" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.884227 4774 scope.go:117] "RemoveContainer" containerID="d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.898803 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-pj24q"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.899416 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" podStartSLOduration=1.8994039489999999 podStartE2EDuration="1.899403949s" podCreationTimestamp="2026-01-27 00:13:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:13:20.898364251 +0000 UTC m=+379.204141175" watchObservedRunningTime="2026-01-27 00:13:20.899403949 +0000 UTC m=+379.205180833" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.906315 4774 scope.go:117] "RemoveContainer" containerID="b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.916040 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-97k6k"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.920653 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-97k6k"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.924042 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxtrq"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.933563 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxtrq"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.933679 4774 scope.go:117] "RemoveContainer" containerID="59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.950709 4774 scope.go:117] "RemoveContainer" containerID="d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f" Jan 27 00:13:20 crc kubenswrapper[4774]: E0127 00:13:20.951137 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f\": container with ID starting with d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f not found: ID does not exist" containerID="d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.951166 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f"} err="failed to get container status \"d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f\": rpc error: code = NotFound desc = could not find container \"d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f\": container with ID starting with d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f not found: ID does not exist" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.951192 4774 scope.go:117] "RemoveContainer" containerID="b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724" Jan 27 00:13:20 crc kubenswrapper[4774]: E0127 00:13:20.951563 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724\": container with ID starting with b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724 not found: ID does not exist" 
containerID="b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.951588 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724"} err="failed to get container status \"b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724\": rpc error: code = NotFound desc = could not find container \"b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724\": container with ID starting with b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724 not found: ID does not exist" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.951602 4774 scope.go:117] "RemoveContainer" containerID="59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532" Jan 27 00:13:20 crc kubenswrapper[4774]: E0127 00:13:20.952935 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532\": container with ID starting with 59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532 not found: ID does not exist" containerID="59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.952961 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532"} err="failed to get container status \"59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532\": rpc error: code = NotFound desc = could not find container \"59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532\": container with ID starting with 59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532 not found: ID does not exist" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.952973 4774 scope.go:117] "RemoveContainer" containerID="4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.967976 4774 scope.go:117] "RemoveContainer" containerID="b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.983324 4774 scope.go:117] "RemoveContainer" containerID="817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.998226 4774 scope.go:117] "RemoveContainer" containerID="4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4" Jan 27 00:13:20 crc kubenswrapper[4774]: E0127 00:13:20.999229 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4\": container with ID starting with 4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4 not found: ID does not exist" containerID="4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.999260 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4"} err="failed to get container status \"4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4\": rpc error: code = NotFound desc = could not find container 
\"4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4\": container with ID starting with 4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4 not found: ID does not exist" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.999288 4774 scope.go:117] "RemoveContainer" containerID="b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0" Jan 27 00:13:20 crc kubenswrapper[4774]: E0127 00:13:20.999589 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0\": container with ID starting with b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0 not found: ID does not exist" containerID="b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.999609 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0"} err="failed to get container status \"b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0\": rpc error: code = NotFound desc = could not find container \"b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0\": container with ID starting with b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0 not found: ID does not exist" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.999621 4774 scope.go:117] "RemoveContainer" containerID="817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3" Jan 27 00:13:20 crc kubenswrapper[4774]: E0127 00:13:20.999812 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3\": container with ID starting with 817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3 not found: ID does not exist" containerID="817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.999835 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3"} err="failed to get container status \"817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3\": rpc error: code = NotFound desc = could not find container \"817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3\": container with ID starting with 817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3 not found: ID does not exist" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.999849 4774 scope.go:117] "RemoveContainer" containerID="e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.016535 4774 scope.go:117] "RemoveContainer" containerID="d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.039644 4774 scope.go:117] "RemoveContainer" containerID="783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.065255 4774 scope.go:117] "RemoveContainer" containerID="e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.066784 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b\": container with ID starting with e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b not found: ID does not exist" containerID="e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.066882 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b"} err="failed to get container status \"e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b\": rpc error: code = NotFound desc = could not find container \"e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b\": container with ID starting with e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b not found: ID does not exist" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.066934 4774 scope.go:117] "RemoveContainer" containerID="d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.067675 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c\": container with ID starting with d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c not found: ID does not exist" containerID="d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.067714 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c"} err="failed to get container status \"d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c\": rpc error: code = NotFound desc = could not find container \"d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c\": container with ID starting with d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c not found: ID does not exist" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.067739 4774 scope.go:117] "RemoveContainer" containerID="783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.068510 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9\": container with ID starting with 783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9 not found: ID does not exist" containerID="783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.068573 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9"} err="failed to get container status \"783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9\": rpc error: code = NotFound desc = could not find container \"783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9\": container with ID starting with 783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9 not found: ID does not exist" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.068635 4774 scope.go:117] "RemoveContainer" containerID="b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5" Jan 27 
00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.115918 4774 scope.go:117] "RemoveContainer" containerID="b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.116696 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5\": container with ID starting with b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5 not found: ID does not exist" containerID="b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.116776 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5"} err="failed to get container status \"b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5\": rpc error: code = NotFound desc = could not find container \"b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5\": container with ID starting with b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5 not found: ID does not exist" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518099 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7bhhh"] Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518359 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386f196b-c4bc-4fea-924b-c0487a352310" containerName="extract-utilities" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518375 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="386f196b-c4bc-4fea-924b-c0487a352310" containerName="extract-utilities" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518385 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518394 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518405 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386f196b-c4bc-4fea-924b-c0487a352310" containerName="extract-content" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518414 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="386f196b-c4bc-4fea-924b-c0487a352310" containerName="extract-content" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518426 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerName="extract-content" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518434 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerName="extract-content" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518446 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386f196b-c4bc-4fea-924b-c0487a352310" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518456 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="386f196b-c4bc-4fea-924b-c0487a352310" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518470 4774 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerName="extract-utilities" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518479 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerName="extract-utilities" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518491 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518499 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518514 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af002227-deb6-4a24-8ce5-051f93bc178b" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518522 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="af002227-deb6-4a24-8ce5-051f93bc178b" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518531 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" containerName="marketplace-operator" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518538 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" containerName="marketplace-operator" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518548 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerName="extract-content" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518555 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerName="extract-content" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518568 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af002227-deb6-4a24-8ce5-051f93bc178b" containerName="extract-utilities" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518576 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="af002227-deb6-4a24-8ce5-051f93bc178b" containerName="extract-utilities" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518586 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerName="extract-utilities" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518594 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerName="extract-utilities" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518605 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af002227-deb6-4a24-8ce5-051f93bc178b" containerName="extract-content" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518613 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="af002227-deb6-4a24-8ce5-051f93bc178b" containerName="extract-content" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518738 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="386f196b-c4bc-4fea-924b-c0487a352310" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518753 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518767 4774 
memory_manager.go:354] "RemoveStaleState removing state" podUID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518780 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="af002227-deb6-4a24-8ce5-051f93bc178b" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518790 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" containerName="marketplace-operator" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.519740 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.522434 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.535234 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7bhhh"] Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.560257 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b4a869e-2e74-4226-b41c-c8a481a0728b-utilities\") pod \"certified-operators-7bhhh\" (UID: \"0b4a869e-2e74-4226-b41c-c8a481a0728b\") " pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.560432 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrssf\" (UniqueName: \"kubernetes.io/projected/0b4a869e-2e74-4226-b41c-c8a481a0728b-kube-api-access-qrssf\") pod \"certified-operators-7bhhh\" (UID: \"0b4a869e-2e74-4226-b41c-c8a481a0728b\") " pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.560484 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b4a869e-2e74-4226-b41c-c8a481a0728b-catalog-content\") pod \"certified-operators-7bhhh\" (UID: \"0b4a869e-2e74-4226-b41c-c8a481a0728b\") " pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.662206 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrssf\" (UniqueName: \"kubernetes.io/projected/0b4a869e-2e74-4226-b41c-c8a481a0728b-kube-api-access-qrssf\") pod \"certified-operators-7bhhh\" (UID: \"0b4a869e-2e74-4226-b41c-c8a481a0728b\") " pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.662329 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b4a869e-2e74-4226-b41c-c8a481a0728b-catalog-content\") pod \"certified-operators-7bhhh\" (UID: \"0b4a869e-2e74-4226-b41c-c8a481a0728b\") " pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.662421 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b4a869e-2e74-4226-b41c-c8a481a0728b-utilities\") pod \"certified-operators-7bhhh\" (UID: \"0b4a869e-2e74-4226-b41c-c8a481a0728b\") " pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 
00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.663491 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b4a869e-2e74-4226-b41c-c8a481a0728b-catalog-content\") pod \"certified-operators-7bhhh\" (UID: \"0b4a869e-2e74-4226-b41c-c8a481a0728b\") " pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.663555 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b4a869e-2e74-4226-b41c-c8a481a0728b-utilities\") pod \"certified-operators-7bhhh\" (UID: \"0b4a869e-2e74-4226-b41c-c8a481a0728b\") " pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.688029 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrssf\" (UniqueName: \"kubernetes.io/projected/0b4a869e-2e74-4226-b41c-c8a481a0728b-kube-api-access-qrssf\") pod \"certified-operators-7bhhh\" (UID: \"0b4a869e-2e74-4226-b41c-c8a481a0728b\") " pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.834202 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" event={"ID":"fb20bbdd-0a13-4d07-8c4a-8c2285de3173","Type":"ContainerStarted","Data":"64877aa3b7149ca58578844502bee7101605f38b4348d45d430ea5251007cc20"} Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.851813 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.858944 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.112323 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7bhhh"] Jan 27 00:13:22 crc kubenswrapper[4774]: W0127 00:13:22.131382 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b4a869e_2e74_4226_b41c_c8a481a0728b.slice/crio-e02d47eb75a33b43615a15af0012d068b8d01d26136c5f7907e6445560875e8b WatchSource:0}: Error finding container e02d47eb75a33b43615a15af0012d068b8d01d26136c5f7907e6445560875e8b: Status 404 returned error can't find the container with id e02d47eb75a33b43615a15af0012d068b8d01d26136c5f7907e6445560875e8b Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.363587 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" path="/var/lib/kubelet/pods/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72/volumes" Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.364534 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="386f196b-c4bc-4fea-924b-c0487a352310" path="/var/lib/kubelet/pods/386f196b-c4bc-4fea-924b-c0487a352310/volumes" Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.365169 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" path="/var/lib/kubelet/pods/687ff3b2-f773-482b-9bf7-1b6135b4d6ac/volumes" Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.366386 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cad24dd-acc4-40e5-8380-e0d74be79921" 
path="/var/lib/kubelet/pods/6cad24dd-acc4-40e5-8380-e0d74be79921/volumes" Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.367085 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af002227-deb6-4a24-8ce5-051f93bc178b" path="/var/lib/kubelet/pods/af002227-deb6-4a24-8ce5-051f93bc178b/volumes" Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.887612 4774 generic.go:334] "Generic (PLEG): container finished" podID="0b4a869e-2e74-4226-b41c-c8a481a0728b" containerID="8b8aeff7b3ebdbaf9b163700840cc3ddd2df567ed32b3e95708874387def64dc" exitCode=0 Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.887687 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bhhh" event={"ID":"0b4a869e-2e74-4226-b41c-c8a481a0728b","Type":"ContainerDied","Data":"8b8aeff7b3ebdbaf9b163700840cc3ddd2df567ed32b3e95708874387def64dc"} Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.887774 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bhhh" event={"ID":"0b4a869e-2e74-4226-b41c-c8a481a0728b","Type":"ContainerStarted","Data":"e02d47eb75a33b43615a15af0012d068b8d01d26136c5f7907e6445560875e8b"} Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.921876 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gb9l9"] Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.923132 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.929586 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.938465 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb9l9"] Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.985998 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-utilities\") pod \"redhat-marketplace-gb9l9\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.986089 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-catalog-content\") pod \"redhat-marketplace-gb9l9\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.986463 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzbhw\" (UniqueName: \"kubernetes.io/projected/36542624-2b85-40c0-a571-9ded4d2dcb9b-kube-api-access-wzbhw\") pod \"redhat-marketplace-gb9l9\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.087790 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-catalog-content\") pod \"redhat-marketplace-gb9l9\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " 
pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.087900 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzbhw\" (UniqueName: \"kubernetes.io/projected/36542624-2b85-40c0-a571-9ded4d2dcb9b-kube-api-access-wzbhw\") pod \"redhat-marketplace-gb9l9\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.087946 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-utilities\") pod \"redhat-marketplace-gb9l9\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.088429 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-utilities\") pod \"redhat-marketplace-gb9l9\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.088472 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-catalog-content\") pod \"redhat-marketplace-gb9l9\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.114501 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzbhw\" (UniqueName: \"kubernetes.io/projected/36542624-2b85-40c0-a571-9ded4d2dcb9b-kube-api-access-wzbhw\") pod \"redhat-marketplace-gb9l9\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.248484 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.693214 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb9l9"] Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.897369 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bhhh" event={"ID":"0b4a869e-2e74-4226-b41c-c8a481a0728b","Type":"ContainerStarted","Data":"2c6dd8878a47022f56e27b5201cf3e35ff53f86100131049fb6f5a97d01e4b04"} Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.900136 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb9l9" event={"ID":"36542624-2b85-40c0-a571-9ded4d2dcb9b","Type":"ContainerStarted","Data":"51eaded7e7d2f079a111fb785033a43f6ec05414382be88a1e655bf429602842"} Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.900211 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb9l9" event={"ID":"36542624-2b85-40c0-a571-9ded4d2dcb9b","Type":"ContainerStarted","Data":"bc02ff041620d6abe4f9549d299ee2ebf620ffad6df25e35936cc68f50fc4bcd"} Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.931195 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tg7j5"] Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.932503 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.934536 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.948149 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tg7j5"] Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.004827 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8be8b5b9-e3bc-4236-90ca-3d3808fa39b4-utilities\") pod \"redhat-operators-tg7j5\" (UID: \"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4\") " pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.012422 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfcm9\" (UniqueName: \"kubernetes.io/projected/8be8b5b9-e3bc-4236-90ca-3d3808fa39b4-kube-api-access-sfcm9\") pod \"redhat-operators-tg7j5\" (UID: \"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4\") " pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.012608 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8be8b5b9-e3bc-4236-90ca-3d3808fa39b4-catalog-content\") pod \"redhat-operators-tg7j5\" (UID: \"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4\") " pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.113506 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8be8b5b9-e3bc-4236-90ca-3d3808fa39b4-utilities\") pod \"redhat-operators-tg7j5\" (UID: \"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4\") " pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 
00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.113578 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfcm9\" (UniqueName: \"kubernetes.io/projected/8be8b5b9-e3bc-4236-90ca-3d3808fa39b4-kube-api-access-sfcm9\") pod \"redhat-operators-tg7j5\" (UID: \"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4\") " pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.113656 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8be8b5b9-e3bc-4236-90ca-3d3808fa39b4-catalog-content\") pod \"redhat-operators-tg7j5\" (UID: \"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4\") " pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.114256 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8be8b5b9-e3bc-4236-90ca-3d3808fa39b4-catalog-content\") pod \"redhat-operators-tg7j5\" (UID: \"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4\") " pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.114416 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8be8b5b9-e3bc-4236-90ca-3d3808fa39b4-utilities\") pod \"redhat-operators-tg7j5\" (UID: \"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4\") " pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.134124 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfcm9\" (UniqueName: \"kubernetes.io/projected/8be8b5b9-e3bc-4236-90ca-3d3808fa39b4-kube-api-access-sfcm9\") pod \"redhat-operators-tg7j5\" (UID: \"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4\") " pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.290660 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.531234 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tg7j5"] Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.911736 4774 generic.go:334] "Generic (PLEG): container finished" podID="0b4a869e-2e74-4226-b41c-c8a481a0728b" containerID="2c6dd8878a47022f56e27b5201cf3e35ff53f86100131049fb6f5a97d01e4b04" exitCode=0 Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.911933 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bhhh" event={"ID":"0b4a869e-2e74-4226-b41c-c8a481a0728b","Type":"ContainerDied","Data":"2c6dd8878a47022f56e27b5201cf3e35ff53f86100131049fb6f5a97d01e4b04"} Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.915517 4774 generic.go:334] "Generic (PLEG): container finished" podID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerID="51eaded7e7d2f079a111fb785033a43f6ec05414382be88a1e655bf429602842" exitCode=0 Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.915723 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb9l9" event={"ID":"36542624-2b85-40c0-a571-9ded4d2dcb9b","Type":"ContainerDied","Data":"51eaded7e7d2f079a111fb785033a43f6ec05414382be88a1e655bf429602842"} Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.917455 4774 generic.go:334] "Generic (PLEG): container finished" podID="8be8b5b9-e3bc-4236-90ca-3d3808fa39b4" containerID="70e5425f9e3c516bf5c6c0404244081e7c1f4b91fe2e253905d74ac101fc43ee" exitCode=0 Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.917514 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg7j5" event={"ID":"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4","Type":"ContainerDied","Data":"70e5425f9e3c516bf5c6c0404244081e7c1f4b91fe2e253905d74ac101fc43ee"} Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.917560 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg7j5" event={"ID":"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4","Type":"ContainerStarted","Data":"124200faa1093ffc5f82f012ae8756c1b9cb9df7d6a711ebf34b6c3c8643dfed"} Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.336538 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6krcw"] Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.337698 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.340803 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.345554 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6krcw"] Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.463527 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89dc2f6e-3f9e-4098-b5d4-ff9481de0824-catalog-content\") pod \"community-operators-6krcw\" (UID: \"89dc2f6e-3f9e-4098-b5d4-ff9481de0824\") " pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.463981 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf96c\" (UniqueName: \"kubernetes.io/projected/89dc2f6e-3f9e-4098-b5d4-ff9481de0824-kube-api-access-kf96c\") pod \"community-operators-6krcw\" (UID: \"89dc2f6e-3f9e-4098-b5d4-ff9481de0824\") " pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.464073 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89dc2f6e-3f9e-4098-b5d4-ff9481de0824-utilities\") pod \"community-operators-6krcw\" (UID: \"89dc2f6e-3f9e-4098-b5d4-ff9481de0824\") " pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.565957 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89dc2f6e-3f9e-4098-b5d4-ff9481de0824-utilities\") pod \"community-operators-6krcw\" (UID: \"89dc2f6e-3f9e-4098-b5d4-ff9481de0824\") " pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.566067 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89dc2f6e-3f9e-4098-b5d4-ff9481de0824-catalog-content\") pod \"community-operators-6krcw\" (UID: \"89dc2f6e-3f9e-4098-b5d4-ff9481de0824\") " pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.566118 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf96c\" (UniqueName: \"kubernetes.io/projected/89dc2f6e-3f9e-4098-b5d4-ff9481de0824-kube-api-access-kf96c\") pod \"community-operators-6krcw\" (UID: \"89dc2f6e-3f9e-4098-b5d4-ff9481de0824\") " pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.566560 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89dc2f6e-3f9e-4098-b5d4-ff9481de0824-catalog-content\") pod \"community-operators-6krcw\" (UID: \"89dc2f6e-3f9e-4098-b5d4-ff9481de0824\") " pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.566592 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89dc2f6e-3f9e-4098-b5d4-ff9481de0824-utilities\") pod \"community-operators-6krcw\" (UID: 
\"89dc2f6e-3f9e-4098-b5d4-ff9481de0824\") " pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.585803 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf96c\" (UniqueName: \"kubernetes.io/projected/89dc2f6e-3f9e-4098-b5d4-ff9481de0824-kube-api-access-kf96c\") pod \"community-operators-6krcw\" (UID: \"89dc2f6e-3f9e-4098-b5d4-ff9481de0824\") " pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.669332 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.926150 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bhhh" event={"ID":"0b4a869e-2e74-4226-b41c-c8a481a0728b","Type":"ContainerStarted","Data":"345b5c0316ebee074c748f3ef5140c0f4c008f93c214eaca618daf79b3106144"} Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.929640 4774 generic.go:334] "Generic (PLEG): container finished" podID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerID="20d5f60ef29061380fe8995d910d907f50edc601d69e30e44c129e980a103c38" exitCode=0 Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.929705 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb9l9" event={"ID":"36542624-2b85-40c0-a571-9ded4d2dcb9b","Type":"ContainerDied","Data":"20d5f60ef29061380fe8995d910d907f50edc601d69e30e44c129e980a103c38"} Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.932302 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg7j5" event={"ID":"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4","Type":"ContainerStarted","Data":"33b37e04507546c38a3ecd1d6b7da9fa51f5f561ac6704c2db49282fbeb29bfe"} Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.943427 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7bhhh" podStartSLOduration=2.472101325 podStartE2EDuration="4.943407194s" podCreationTimestamp="2026-01-27 00:13:21 +0000 UTC" firstStartedPulling="2026-01-27 00:13:22.88987437 +0000 UTC m=+381.195651254" lastFinishedPulling="2026-01-27 00:13:25.361180229 +0000 UTC m=+383.666957123" observedRunningTime="2026-01-27 00:13:25.942160441 +0000 UTC m=+384.247937315" watchObservedRunningTime="2026-01-27 00:13:25.943407194 +0000 UTC m=+384.249184068" Jan 27 00:13:26 crc kubenswrapper[4774]: I0127 00:13:26.137122 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6krcw"] Jan 27 00:13:26 crc kubenswrapper[4774]: I0127 00:13:26.943195 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb9l9" event={"ID":"36542624-2b85-40c0-a571-9ded4d2dcb9b","Type":"ContainerStarted","Data":"ab6b4ff7811ed8cdc4187956ce47abd65643238fa2c4fc2041db620b531de40f"} Jan 27 00:13:26 crc kubenswrapper[4774]: I0127 00:13:26.946548 4774 generic.go:334] "Generic (PLEG): container finished" podID="8be8b5b9-e3bc-4236-90ca-3d3808fa39b4" containerID="33b37e04507546c38a3ecd1d6b7da9fa51f5f561ac6704c2db49282fbeb29bfe" exitCode=0 Jan 27 00:13:26 crc kubenswrapper[4774]: I0127 00:13:26.946622 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg7j5" 
event={"ID":"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4","Type":"ContainerDied","Data":"33b37e04507546c38a3ecd1d6b7da9fa51f5f561ac6704c2db49282fbeb29bfe"} Jan 27 00:13:26 crc kubenswrapper[4774]: I0127 00:13:26.954700 4774 generic.go:334] "Generic (PLEG): container finished" podID="89dc2f6e-3f9e-4098-b5d4-ff9481de0824" containerID="14505b82dea48b6bf2692bc6368a4affbcd972462cba6ce0c5188462a9bf1e59" exitCode=0 Jan 27 00:13:26 crc kubenswrapper[4774]: I0127 00:13:26.954786 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6krcw" event={"ID":"89dc2f6e-3f9e-4098-b5d4-ff9481de0824","Type":"ContainerDied","Data":"14505b82dea48b6bf2692bc6368a4affbcd972462cba6ce0c5188462a9bf1e59"} Jan 27 00:13:26 crc kubenswrapper[4774]: I0127 00:13:26.954830 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6krcw" event={"ID":"89dc2f6e-3f9e-4098-b5d4-ff9481de0824","Type":"ContainerStarted","Data":"107deb40c13230bebd0e789c080095686cbb51a5317eb7f65bc823e890ddc4f2"} Jan 27 00:13:26 crc kubenswrapper[4774]: I0127 00:13:26.972073 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gb9l9" podStartSLOduration=3.521119382 podStartE2EDuration="4.972050868s" podCreationTimestamp="2026-01-27 00:13:22 +0000 UTC" firstStartedPulling="2026-01-27 00:13:24.917911245 +0000 UTC m=+383.223688129" lastFinishedPulling="2026-01-27 00:13:26.368842731 +0000 UTC m=+384.674619615" observedRunningTime="2026-01-27 00:13:26.968070031 +0000 UTC m=+385.273846925" watchObservedRunningTime="2026-01-27 00:13:26.972050868 +0000 UTC m=+385.277827772" Jan 27 00:13:27 crc kubenswrapper[4774]: I0127 00:13:27.962446 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg7j5" event={"ID":"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4","Type":"ContainerStarted","Data":"d0457685baa12dca44afe111dbf293b4a69ba1ffd872aa78e2ccee3ef0295d36"} Jan 27 00:13:27 crc kubenswrapper[4774]: I0127 00:13:27.967707 4774 generic.go:334] "Generic (PLEG): container finished" podID="89dc2f6e-3f9e-4098-b5d4-ff9481de0824" containerID="ab05f1624e10cf0666152c3a1a0f5ca887717e8fd98871d618caac5de240eaaf" exitCode=0 Jan 27 00:13:27 crc kubenswrapper[4774]: I0127 00:13:27.967823 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6krcw" event={"ID":"89dc2f6e-3f9e-4098-b5d4-ff9481de0824","Type":"ContainerDied","Data":"ab05f1624e10cf0666152c3a1a0f5ca887717e8fd98871d618caac5de240eaaf"} Jan 27 00:13:28 crc kubenswrapper[4774]: I0127 00:13:28.008754 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tg7j5" podStartSLOduration=2.558345416 podStartE2EDuration="5.008729155s" podCreationTimestamp="2026-01-27 00:13:23 +0000 UTC" firstStartedPulling="2026-01-27 00:13:24.921173342 +0000 UTC m=+383.226950226" lastFinishedPulling="2026-01-27 00:13:27.371557041 +0000 UTC m=+385.677333965" observedRunningTime="2026-01-27 00:13:27.987795396 +0000 UTC m=+386.293572280" watchObservedRunningTime="2026-01-27 00:13:28.008729155 +0000 UTC m=+386.314506039" Jan 27 00:13:28 crc kubenswrapper[4774]: I0127 00:13:28.977104 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6krcw" event={"ID":"89dc2f6e-3f9e-4098-b5d4-ff9481de0824","Type":"ContainerStarted","Data":"8aec29af7c4c2a86d9c2a6d7defd4b2644095488c482807de8e6b8d35bedcd14"} Jan 27 00:13:28 crc 
kubenswrapper[4774]: I0127 00:13:28.998041 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6krcw" podStartSLOduration=2.566818877 podStartE2EDuration="3.998021387s" podCreationTimestamp="2026-01-27 00:13:25 +0000 UTC" firstStartedPulling="2026-01-27 00:13:26.957330494 +0000 UTC m=+385.263107378" lastFinishedPulling="2026-01-27 00:13:28.388533004 +0000 UTC m=+386.694309888" observedRunningTime="2026-01-27 00:13:28.993939138 +0000 UTC m=+387.299716022" watchObservedRunningTime="2026-01-27 00:13:28.998021387 +0000 UTC m=+387.303798281" Jan 27 00:13:31 crc kubenswrapper[4774]: I0127 00:13:31.853165 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:31 crc kubenswrapper[4774]: I0127 00:13:31.854037 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:31 crc kubenswrapper[4774]: I0127 00:13:31.909297 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:32 crc kubenswrapper[4774]: I0127 00:13:32.037579 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:33 crc kubenswrapper[4774]: I0127 00:13:33.249471 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:33 crc kubenswrapper[4774]: I0127 00:13:33.250084 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:33 crc kubenswrapper[4774]: I0127 00:13:33.306146 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:34 crc kubenswrapper[4774]: I0127 00:13:34.055143 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:34 crc kubenswrapper[4774]: I0127 00:13:34.291111 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:34 crc kubenswrapper[4774]: I0127 00:13:34.291342 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:34 crc kubenswrapper[4774]: I0127 00:13:34.341066 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:35 crc kubenswrapper[4774]: I0127 00:13:35.349642 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:35 crc kubenswrapper[4774]: I0127 00:13:35.670198 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:35 crc kubenswrapper[4774]: I0127 00:13:35.670810 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:35 crc kubenswrapper[4774]: I0127 00:13:35.719067 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:36 crc kubenswrapper[4774]: I0127 00:13:36.346750 4774 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:36 crc kubenswrapper[4774]: I0127 00:13:36.675446 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:13:36 crc kubenswrapper[4774]: I0127 00:13:36.675512 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:13:36 crc kubenswrapper[4774]: I0127 00:13:36.675563 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:13:36 crc kubenswrapper[4774]: I0127 00:13:36.676196 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"447c321d916e303176c9279a924cd06866f1f990692f85480dc8efad70b988f5"} pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 00:13:36 crc kubenswrapper[4774]: I0127 00:13:36.676258 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" containerID="cri-o://447c321d916e303176c9279a924cd06866f1f990692f85480dc8efad70b988f5" gracePeriod=600 Jan 27 00:13:37 crc kubenswrapper[4774]: I0127 00:13:37.310501 4774 generic.go:334] "Generic (PLEG): container finished" podID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerID="447c321d916e303176c9279a924cd06866f1f990692f85480dc8efad70b988f5" exitCode=0 Jan 27 00:13:37 crc kubenswrapper[4774]: I0127 00:13:37.310683 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerDied","Data":"447c321d916e303176c9279a924cd06866f1f990692f85480dc8efad70b988f5"} Jan 27 00:13:37 crc kubenswrapper[4774]: I0127 00:13:37.311580 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"1835e32e3ad8e493de6e8f27a5dbcc7b2eb8f1908c27b56eb3a4006aa36c9b0d"} Jan 27 00:13:37 crc kubenswrapper[4774]: I0127 00:13:37.311612 4774 scope.go:117] "RemoveContainer" containerID="18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85" Jan 27 00:13:37 crc kubenswrapper[4774]: I0127 00:13:37.612478 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" podUID="df626623-28b8-43a3-a567-f14b1e95075a" containerName="registry" containerID="cri-o://3d1d5a1b2275a0bf584d707ccddcf233f3105c423230ffd6922c612599ca8387" gracePeriod=30 Jan 27 00:13:39 crc kubenswrapper[4774]: I0127 00:13:39.328011 4774 generic.go:334] "Generic (PLEG): container finished" podID="df626623-28b8-43a3-a567-f14b1e95075a" 
containerID="3d1d5a1b2275a0bf584d707ccddcf233f3105c423230ffd6922c612599ca8387" exitCode=0 Jan 27 00:13:39 crc kubenswrapper[4774]: I0127 00:13:39.328090 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" event={"ID":"df626623-28b8-43a3-a567-f14b1e95075a","Type":"ContainerDied","Data":"3d1d5a1b2275a0bf584d707ccddcf233f3105c423230ffd6922c612599ca8387"} Jan 27 00:13:39 crc kubenswrapper[4774]: I0127 00:13:39.945109 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.060021 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-bound-sa-token\") pod \"df626623-28b8-43a3-a567-f14b1e95075a\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.060105 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df626623-28b8-43a3-a567-f14b1e95075a-installation-pull-secrets\") pod \"df626623-28b8-43a3-a567-f14b1e95075a\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.060377 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"df626623-28b8-43a3-a567-f14b1e95075a\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.060413 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-registry-certificates\") pod \"df626623-28b8-43a3-a567-f14b1e95075a\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.060493 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj86g\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-kube-api-access-sj86g\") pod \"df626623-28b8-43a3-a567-f14b1e95075a\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.060523 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-trusted-ca\") pod \"df626623-28b8-43a3-a567-f14b1e95075a\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.060568 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df626623-28b8-43a3-a567-f14b1e95075a-ca-trust-extracted\") pod \"df626623-28b8-43a3-a567-f14b1e95075a\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.060611 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-registry-tls\") pod \"df626623-28b8-43a3-a567-f14b1e95075a\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " Jan 27 00:13:40 crc 
kubenswrapper[4774]: I0127 00:13:40.061453 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "df626623-28b8-43a3-a567-f14b1e95075a" (UID: "df626623-28b8-43a3-a567-f14b1e95075a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.067166 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df626623-28b8-43a3-a567-f14b1e95075a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "df626623-28b8-43a3-a567-f14b1e95075a" (UID: "df626623-28b8-43a3-a567-f14b1e95075a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.070444 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "df626623-28b8-43a3-a567-f14b1e95075a" (UID: "df626623-28b8-43a3-a567-f14b1e95075a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.070881 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "df626623-28b8-43a3-a567-f14b1e95075a" (UID: "df626623-28b8-43a3-a567-f14b1e95075a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.071061 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-kube-api-access-sj86g" (OuterVolumeSpecName: "kube-api-access-sj86g") pod "df626623-28b8-43a3-a567-f14b1e95075a" (UID: "df626623-28b8-43a3-a567-f14b1e95075a"). InnerVolumeSpecName "kube-api-access-sj86g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.082338 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "df626623-28b8-43a3-a567-f14b1e95075a" (UID: "df626623-28b8-43a3-a567-f14b1e95075a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.090835 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df626623-28b8-43a3-a567-f14b1e95075a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "df626623-28b8-43a3-a567-f14b1e95075a" (UID: "df626623-28b8-43a3-a567-f14b1e95075a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.106770 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "df626623-28b8-43a3-a567-f14b1e95075a" (UID: "df626623-28b8-43a3-a567-f14b1e95075a"). 
InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.162498 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj86g\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-kube-api-access-sj86g\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.162536 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.162546 4774 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df626623-28b8-43a3-a567-f14b1e95075a-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.162554 4774 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.162562 4774 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.162571 4774 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df626623-28b8-43a3-a567-f14b1e95075a-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.162578 4774 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.335545 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" event={"ID":"df626623-28b8-43a3-a567-f14b1e95075a","Type":"ContainerDied","Data":"8178e110af2bc09c51dee988ba34c5369cbe128f1c84c871cab3eebbd6fb92bf"} Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.335609 4774 scope.go:117] "RemoveContainer" containerID="3d1d5a1b2275a0bf584d707ccddcf233f3105c423230ffd6922c612599ca8387" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.335720 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.377139 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zgbtz"] Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.383792 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zgbtz"] Jan 27 00:13:42 crc kubenswrapper[4774]: I0127 00:13:42.364282 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df626623-28b8-43a3-a567-f14b1e95075a" path="/var/lib/kubelet/pods/df626623-28b8-43a3-a567-f14b1e95075a/volumes" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.180343 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn"] Jan 27 00:15:00 crc kubenswrapper[4774]: E0127 00:15:00.184038 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df626623-28b8-43a3-a567-f14b1e95075a" containerName="registry" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.184274 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="df626623-28b8-43a3-a567-f14b1e95075a" containerName="registry" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.184682 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="df626623-28b8-43a3-a567-f14b1e95075a" containerName="registry" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.185748 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.188023 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.191044 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.195083 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn"] Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.294264 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/384f6205-7bf2-47e1-909f-def30825754d-secret-volume\") pod \"collect-profiles-29491215-64hnn\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.294346 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsgxv\" (UniqueName: \"kubernetes.io/projected/384f6205-7bf2-47e1-909f-def30825754d-kube-api-access-nsgxv\") pod \"collect-profiles-29491215-64hnn\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.294453 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/384f6205-7bf2-47e1-909f-def30825754d-config-volume\") pod \"collect-profiles-29491215-64hnn\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.396056 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/384f6205-7bf2-47e1-909f-def30825754d-config-volume\") pod \"collect-profiles-29491215-64hnn\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.396315 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/384f6205-7bf2-47e1-909f-def30825754d-secret-volume\") pod \"collect-profiles-29491215-64hnn\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.396372 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsgxv\" (UniqueName: \"kubernetes.io/projected/384f6205-7bf2-47e1-909f-def30825754d-kube-api-access-nsgxv\") pod \"collect-profiles-29491215-64hnn\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.397544 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/384f6205-7bf2-47e1-909f-def30825754d-config-volume\") pod \"collect-profiles-29491215-64hnn\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.409139 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/384f6205-7bf2-47e1-909f-def30825754d-secret-volume\") pod \"collect-profiles-29491215-64hnn\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.414398 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsgxv\" (UniqueName: \"kubernetes.io/projected/384f6205-7bf2-47e1-909f-def30825754d-kube-api-access-nsgxv\") pod \"collect-profiles-29491215-64hnn\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.509044 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.718568 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn"] Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.831826 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" event={"ID":"384f6205-7bf2-47e1-909f-def30825754d","Type":"ContainerStarted","Data":"f107b6f277f3dc55072c6f38ed6703d35cd4fd5c24142d3d9e413372e36a70a2"} Jan 27 00:15:01 crc kubenswrapper[4774]: I0127 00:15:01.839077 4774 generic.go:334] "Generic (PLEG): container finished" podID="384f6205-7bf2-47e1-909f-def30825754d" containerID="904c5d3ff81fc19a34066ba9aa7e7fa056cbdc1458b4158be4afb1cf86947ed7" exitCode=0 Jan 27 00:15:01 crc kubenswrapper[4774]: I0127 00:15:01.839140 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" event={"ID":"384f6205-7bf2-47e1-909f-def30825754d","Type":"ContainerDied","Data":"904c5d3ff81fc19a34066ba9aa7e7fa056cbdc1458b4158be4afb1cf86947ed7"} Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.058919 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.229678 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/384f6205-7bf2-47e1-909f-def30825754d-secret-volume\") pod \"384f6205-7bf2-47e1-909f-def30825754d\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.230979 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsgxv\" (UniqueName: \"kubernetes.io/projected/384f6205-7bf2-47e1-909f-def30825754d-kube-api-access-nsgxv\") pod \"384f6205-7bf2-47e1-909f-def30825754d\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.231071 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/384f6205-7bf2-47e1-909f-def30825754d-config-volume\") pod \"384f6205-7bf2-47e1-909f-def30825754d\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.231546 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/384f6205-7bf2-47e1-909f-def30825754d-config-volume" (OuterVolumeSpecName: "config-volume") pod "384f6205-7bf2-47e1-909f-def30825754d" (UID: "384f6205-7bf2-47e1-909f-def30825754d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.231772 4774 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/384f6205-7bf2-47e1-909f-def30825754d-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.236194 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/384f6205-7bf2-47e1-909f-def30825754d-kube-api-access-nsgxv" (OuterVolumeSpecName: "kube-api-access-nsgxv") pod "384f6205-7bf2-47e1-909f-def30825754d" (UID: "384f6205-7bf2-47e1-909f-def30825754d"). InnerVolumeSpecName "kube-api-access-nsgxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.237063 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/384f6205-7bf2-47e1-909f-def30825754d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "384f6205-7bf2-47e1-909f-def30825754d" (UID: "384f6205-7bf2-47e1-909f-def30825754d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.332294 4774 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/384f6205-7bf2-47e1-909f-def30825754d-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.332324 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsgxv\" (UniqueName: \"kubernetes.io/projected/384f6205-7bf2-47e1-909f-def30825754d-kube-api-access-nsgxv\") on node \"crc\" DevicePath \"\"" Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.850599 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" event={"ID":"384f6205-7bf2-47e1-909f-def30825754d","Type":"ContainerDied","Data":"f107b6f277f3dc55072c6f38ed6703d35cd4fd5c24142d3d9e413372e36a70a2"} Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.850896 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f107b6f277f3dc55072c6f38ed6703d35cd4fd5c24142d3d9e413372e36a70a2" Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.850924 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:16:06 crc kubenswrapper[4774]: I0127 00:16:06.674978 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:16:06 crc kubenswrapper[4774]: I0127 00:16:06.675600 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:16:36 crc kubenswrapper[4774]: I0127 00:16:36.675843 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:16:36 crc kubenswrapper[4774]: I0127 00:16:36.676461 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:17:06 crc kubenswrapper[4774]: I0127 00:17:06.675697 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:17:06 crc kubenswrapper[4774]: I0127 00:17:06.676225 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:17:06 crc kubenswrapper[4774]: I0127 00:17:06.676269 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:17:06 crc kubenswrapper[4774]: I0127 00:17:06.676794 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1835e32e3ad8e493de6e8f27a5dbcc7b2eb8f1908c27b56eb3a4006aa36c9b0d"} pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 00:17:06 crc kubenswrapper[4774]: I0127 00:17:06.676848 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" containerID="cri-o://1835e32e3ad8e493de6e8f27a5dbcc7b2eb8f1908c27b56eb3a4006aa36c9b0d" gracePeriod=600 Jan 27 00:17:07 crc kubenswrapper[4774]: I0127 00:17:07.617027 4774 generic.go:334] "Generic (PLEG): container finished" podID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" 
containerID="1835e32e3ad8e493de6e8f27a5dbcc7b2eb8f1908c27b56eb3a4006aa36c9b0d" exitCode=0 Jan 27 00:17:07 crc kubenswrapper[4774]: I0127 00:17:07.617169 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerDied","Data":"1835e32e3ad8e493de6e8f27a5dbcc7b2eb8f1908c27b56eb3a4006aa36c9b0d"} Jan 27 00:17:07 crc kubenswrapper[4774]: I0127 00:17:07.617465 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"b1c2487de37d74b3854324adcbb324d646465194c27d05f07ed619de40219442"} Jan 27 00:17:07 crc kubenswrapper[4774]: I0127 00:17:07.617495 4774 scope.go:117] "RemoveContainer" containerID="447c321d916e303176c9279a924cd06866f1f990692f85480dc8efad70b988f5" Jan 27 00:18:20 crc kubenswrapper[4774]: I0127 00:18:20.874546 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l5rgv"] Jan 27 00:18:20 crc kubenswrapper[4774]: I0127 00:18:20.875792 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovn-controller" containerID="cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37" gracePeriod=30 Jan 27 00:18:20 crc kubenswrapper[4774]: I0127 00:18:20.875921 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="nbdb" containerID="cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907" gracePeriod=30 Jan 27 00:18:20 crc kubenswrapper[4774]: I0127 00:18:20.876070 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="northd" containerID="cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62" gracePeriod=30 Jan 27 00:18:20 crc kubenswrapper[4774]: I0127 00:18:20.876169 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f" gracePeriod=30 Jan 27 00:18:20 crc kubenswrapper[4774]: I0127 00:18:20.876257 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovn-acl-logging" containerID="cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4" gracePeriod=30 Jan 27 00:18:20 crc kubenswrapper[4774]: I0127 00:18:20.876340 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="sbdb" containerID="cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886" gracePeriod=30 Jan 27 00:18:20 crc kubenswrapper[4774]: I0127 00:18:20.876318 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" 
containerName="kube-rbac-proxy-node" containerID="cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d" gracePeriod=30 Jan 27 00:18:20 crc kubenswrapper[4774]: I0127 00:18:20.937055 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" containerID="cri-o://86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c" gracePeriod=30 Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.086176 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/3.log" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.088780 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovn-acl-logging/0.log" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.089436 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovn-controller/0.log" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.089956 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c" exitCode=0 Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.089990 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f" exitCode=0 Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.089997 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d" exitCode=0 Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.090018 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4" exitCode=143 Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.090026 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37" exitCode=143 Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.090037 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c"} Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.090102 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f"} Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.090118 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d"} Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.090130 4774 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4"} Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.090144 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37"} Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.090180 4774 scope.go:117] "RemoveContainer" containerID="549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.092759 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtz9l_0abcf78e-9b05-4b89-94f3-4d3230886ce0/kube-multus/2.log" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.093439 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtz9l_0abcf78e-9b05-4b89-94f3-4d3230886ce0/kube-multus/1.log" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.093489 4774 generic.go:334] "Generic (PLEG): container finished" podID="0abcf78e-9b05-4b89-94f3-4d3230886ce0" containerID="b80af4d88f8c0edcc1099c3d9d22e61df1448e662e454061bfd5aab1317d804c" exitCode=2 Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.093525 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mtz9l" event={"ID":"0abcf78e-9b05-4b89-94f3-4d3230886ce0","Type":"ContainerDied","Data":"b80af4d88f8c0edcc1099c3d9d22e61df1448e662e454061bfd5aab1317d804c"} Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.094045 4774 scope.go:117] "RemoveContainer" containerID="b80af4d88f8c0edcc1099c3d9d22e61df1448e662e454061bfd5aab1317d804c" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.094327 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-mtz9l_openshift-multus(0abcf78e-9b05-4b89-94f3-4d3230886ce0)\"" pod="openshift-multus/multus-mtz9l" podUID="0abcf78e-9b05-4b89-94f3-4d3230886ce0" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.148725 4774 scope.go:117] "RemoveContainer" containerID="01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.242616 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovn-acl-logging/0.log" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.243147 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovn-controller/0.log" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.243902 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309054 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hxmlk"] Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309557 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="kube-rbac-proxy-node" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309582 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="kube-rbac-proxy-node" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309602 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309632 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309646 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="nbdb" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309654 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="nbdb" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309664 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309672 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309753 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="northd" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309764 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="northd" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309806 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovn-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309816 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovn-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309828 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="kubecfg-setup" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309836 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="kubecfg-setup" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309892 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovn-acl-logging" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309901 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovn-acl-logging" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309915 4774 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="sbdb" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309923 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="sbdb" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309935 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="384f6205-7bf2-47e1-909f-def30825754d" containerName="collect-profiles" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309943 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="384f6205-7bf2-47e1-909f-def30825754d" containerName="collect-profiles" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309984 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="kube-rbac-proxy-ovn-metrics" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309992 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="kube-rbac-proxy-ovn-metrics" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.310006 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310015 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310188 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="northd" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310226 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovn-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310241 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310251 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310262 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310292 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310306 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="sbdb" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310315 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovn-acl-logging" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310327 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="nbdb" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310340 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="kube-rbac-proxy-ovn-metrics" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 
00:18:21.310389 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="kube-rbac-proxy-node" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310402 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="384f6205-7bf2-47e1-909f-def30825754d" containerName="collect-profiles" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.310568 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310578 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310796 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.310986 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310997 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.313714 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420031 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-systemd\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420114 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-env-overrides\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420140 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-netns\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420177 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-etc-openvswitch\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420205 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovn-node-metrics-cert\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420243 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-var-lib-openvswitch\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420275 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-log-socket\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420300 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-systemd-units\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420323 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-netd\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420348 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-ovn\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420379 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-node-log\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420424 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-config\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420449 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-openvswitch\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420477 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-bin\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420533 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-ovn-kubernetes\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420564 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420589 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-slash\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420625 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l92km\" (UniqueName: \"kubernetes.io/projected/db881c9d-a960-48ae-93bf-d0ccd687e0b9-kube-api-access-l92km\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420644 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-kubelet\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420698 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-script-lib\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420888 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-etc-openvswitch\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420924 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-cni-bin\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420950 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-run-openvswitch\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420976 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-cni-netd\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421000 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ba74c484-716d-469a-9ac1-299eb234026a-ovn-node-metrics-cert\") pod \"ovnkube-node-hxmlk\" 
(UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421030 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ba74c484-716d-469a-9ac1-299eb234026a-ovnkube-script-lib\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421054 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-log-socket\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421077 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-run-netns\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421098 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-slash\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421128 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-systemd-units\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421149 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-var-lib-openvswitch\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421171 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-run-ovn\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421194 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-kubelet\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421222 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ba74c484-716d-469a-9ac1-299eb234026a-env-overrides\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421249 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smbvh\" (UniqueName: \"kubernetes.io/projected/ba74c484-716d-469a-9ac1-299eb234026a-kube-api-access-smbvh\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421287 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-node-log\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421320 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-run-ovn-kubernetes\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421327 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421350 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421453 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-slash" (OuterVolumeSpecName: "host-slash") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421472 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-run-systemd\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421545 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ba74c484-716d-469a-9ac1-299eb234026a-ovnkube-config\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421657 4774 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421677 4774 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-slash\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.422767 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.422829 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.422896 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.422925 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-node-log" (OuterVolumeSpecName: "node-log") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.422918 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.422996 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.423030 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.423672 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.423741 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-log-socket" (OuterVolumeSpecName: "log-socket") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.423741 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.423773 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.423807 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.425293 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). 
InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.425346 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.425630 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.429612 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db881c9d-a960-48ae-93bf-d0ccd687e0b9-kube-api-access-l92km" (OuterVolumeSpecName: "kube-api-access-l92km") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "kube-api-access-l92km". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.431334 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.447823 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.523608 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-node-log\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.523721 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-run-ovn-kubernetes\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.523759 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-node-log\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.523803 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.523850 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.523897 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-run-systemd\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524026 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ba74c484-716d-469a-9ac1-299eb234026a-ovnkube-config\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524052 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-run-systemd\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524073 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-run-ovn-kubernetes\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524204 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-etc-openvswitch\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524101 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-etc-openvswitch\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524348 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-cni-bin\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524373 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-run-openvswitch\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524396 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-cni-netd\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524417 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ba74c484-716d-469a-9ac1-299eb234026a-ovn-node-metrics-cert\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524480 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ba74c484-716d-469a-9ac1-299eb234026a-ovnkube-script-lib\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524502 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-log-socket\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524525 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-run-netns\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: 
I0127 00:18:21.524533 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-cni-bin\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524565 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-slash\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524567 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-cni-netd\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524546 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-slash\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524638 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-run-netns\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524687 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-log-socket\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524787 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-systemd-units\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524838 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-var-lib-openvswitch\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524920 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-run-ovn\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524998 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-kubelet\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.525043 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ba74c484-716d-469a-9ac1-299eb234026a-ovnkube-config\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.525057 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ba74c484-716d-469a-9ac1-299eb234026a-env-overrides\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.525128 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-run-openvswitch\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.525197 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-run-ovn\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.525513 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-systemd-units\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.525557 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-kubelet\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.525581 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-var-lib-openvswitch\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.525725 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smbvh\" (UniqueName: \"kubernetes.io/projected/ba74c484-716d-469a-9ac1-299eb234026a-kube-api-access-smbvh\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.525788 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ba74c484-716d-469a-9ac1-299eb234026a-ovnkube-script-lib\") pod \"ovnkube-node-hxmlk\" (UID: 
\"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526072 4774 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526108 4774 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526127 4774 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526145 4774 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526162 4774 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526181 4774 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526199 4774 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526216 4774 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-log-socket\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526233 4774 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526250 4774 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526267 4774 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526283 4774 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-node-log\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526300 4774 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-config\") on node \"crc\" 
DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526319 4774 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526323 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ba74c484-716d-469a-9ac1-299eb234026a-env-overrides\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526335 4774 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526414 4774 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526437 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l92km\" (UniqueName: \"kubernetes.io/projected/db881c9d-a960-48ae-93bf-d0ccd687e0b9-kube-api-access-l92km\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526456 4774 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.528475 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ba74c484-716d-469a-9ac1-299eb234026a-ovn-node-metrics-cert\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.558321 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smbvh\" (UniqueName: \"kubernetes.io/projected/ba74c484-716d-469a-9ac1-299eb234026a-kube-api-access-smbvh\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.631153 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.104196 4774 generic.go:334] "Generic (PLEG): container finished" podID="ba74c484-716d-469a-9ac1-299eb234026a" containerID="c78f160701dec7e9fe4b1c576bee0236f75bffca60ad65c0838e3abae4adc208" exitCode=0 Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.104283 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" event={"ID":"ba74c484-716d-469a-9ac1-299eb234026a","Type":"ContainerDied","Data":"c78f160701dec7e9fe4b1c576bee0236f75bffca60ad65c0838e3abae4adc208"} Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.104387 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" event={"ID":"ba74c484-716d-469a-9ac1-299eb234026a","Type":"ContainerStarted","Data":"4d24f470b2834422d5b2e205a7ac192b100ed1a914ccb1a11e897bdb71b7b1f8"} Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.110710 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovn-acl-logging/0.log" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.112498 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovn-controller/0.log" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.113615 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886" exitCode=0 Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.113649 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907" exitCode=0 Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.113667 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62" exitCode=0 Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.113750 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.113772 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886"} Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.113937 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907"} Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.114143 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62"} Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.114240 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"0b00b9fd4feb292e024108c60b4d42083f61c51695221c829f872f112bca3d83"} Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.113982 4774 scope.go:117] "RemoveContainer" containerID="86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.123836 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtz9l_0abcf78e-9b05-4b89-94f3-4d3230886ce0/kube-multus/2.log" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.147062 4774 scope.go:117] "RemoveContainer" containerID="a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.177698 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l5rgv"] Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.185215 4774 scope.go:117] "RemoveContainer" containerID="33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.200290 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l5rgv"] Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.236044 4774 scope.go:117] "RemoveContainer" containerID="4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.258934 4774 scope.go:117] "RemoveContainer" containerID="18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.274473 4774 scope.go:117] "RemoveContainer" containerID="c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.291224 4774 scope.go:117] "RemoveContainer" containerID="be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.312452 4774 scope.go:117] "RemoveContainer" containerID="21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.329352 4774 scope.go:117] "RemoveContainer" containerID="b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 
00:18:22.354327 4774 scope.go:117] "RemoveContainer" containerID="86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c" Jan 27 00:18:22 crc kubenswrapper[4774]: E0127 00:18:22.354944 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c\": container with ID starting with 86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c not found: ID does not exist" containerID="86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.355007 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c"} err="failed to get container status \"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c\": rpc error: code = NotFound desc = could not find container \"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c\": container with ID starting with 86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.355052 4774 scope.go:117] "RemoveContainer" containerID="a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886" Jan 27 00:18:22 crc kubenswrapper[4774]: E0127 00:18:22.356486 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\": container with ID starting with a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886 not found: ID does not exist" containerID="a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.356535 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886"} err="failed to get container status \"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\": rpc error: code = NotFound desc = could not find container \"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\": container with ID starting with a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.356571 4774 scope.go:117] "RemoveContainer" containerID="33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907" Jan 27 00:18:22 crc kubenswrapper[4774]: E0127 00:18:22.356939 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\": container with ID starting with 33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907 not found: ID does not exist" containerID="33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.356958 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907"} err="failed to get container status \"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\": rpc error: code = NotFound desc = could not find container \"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\": container with ID 
starting with 33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.357029 4774 scope.go:117] "RemoveContainer" containerID="4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62" Jan 27 00:18:22 crc kubenswrapper[4774]: E0127 00:18:22.357372 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\": container with ID starting with 4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62 not found: ID does not exist" containerID="4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.357397 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62"} err="failed to get container status \"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\": rpc error: code = NotFound desc = could not find container \"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\": container with ID starting with 4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.357411 4774 scope.go:117] "RemoveContainer" containerID="18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f" Jan 27 00:18:22 crc kubenswrapper[4774]: E0127 00:18:22.357701 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\": container with ID starting with 18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f not found: ID does not exist" containerID="18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.357751 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f"} err="failed to get container status \"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\": rpc error: code = NotFound desc = could not find container \"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\": container with ID starting with 18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.357780 4774 scope.go:117] "RemoveContainer" containerID="c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d" Jan 27 00:18:22 crc kubenswrapper[4774]: E0127 00:18:22.358076 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\": container with ID starting with c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d not found: ID does not exist" containerID="c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.358100 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d"} err="failed to get container status 
\"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\": rpc error: code = NotFound desc = could not find container \"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\": container with ID starting with c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.358113 4774 scope.go:117] "RemoveContainer" containerID="be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4" Jan 27 00:18:22 crc kubenswrapper[4774]: E0127 00:18:22.358367 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\": container with ID starting with be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4 not found: ID does not exist" containerID="be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.358394 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4"} err="failed to get container status \"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\": rpc error: code = NotFound desc = could not find container \"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\": container with ID starting with be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.358407 4774 scope.go:117] "RemoveContainer" containerID="21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37" Jan 27 00:18:22 crc kubenswrapper[4774]: E0127 00:18:22.358682 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\": container with ID starting with 21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37 not found: ID does not exist" containerID="21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.358706 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37"} err="failed to get container status \"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\": rpc error: code = NotFound desc = could not find container \"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\": container with ID starting with 21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.358720 4774 scope.go:117] "RemoveContainer" containerID="b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135" Jan 27 00:18:22 crc kubenswrapper[4774]: E0127 00:18:22.359052 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\": container with ID starting with b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135 not found: ID does not exist" containerID="b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.359077 4774 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135"} err="failed to get container status \"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\": rpc error: code = NotFound desc = could not find container \"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\": container with ID starting with b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.359093 4774 scope.go:117] "RemoveContainer" containerID="86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.359448 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c"} err="failed to get container status \"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c\": rpc error: code = NotFound desc = could not find container \"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c\": container with ID starting with 86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.359550 4774 scope.go:117] "RemoveContainer" containerID="a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.360435 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886"} err="failed to get container status \"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\": rpc error: code = NotFound desc = could not find container \"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\": container with ID starting with a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.360459 4774 scope.go:117] "RemoveContainer" containerID="33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.360824 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907"} err="failed to get container status \"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\": rpc error: code = NotFound desc = could not find container \"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\": container with ID starting with 33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.360839 4774 scope.go:117] "RemoveContainer" containerID="4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.361184 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62"} err="failed to get container status \"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\": rpc error: code = NotFound desc = could not find container \"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\": container with ID starting with 4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62 
not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.361216 4774 scope.go:117] "RemoveContainer" containerID="18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.361684 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f"} err="failed to get container status \"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\": rpc error: code = NotFound desc = could not find container \"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\": container with ID starting with 18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.361726 4774 scope.go:117] "RemoveContainer" containerID="c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.362418 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d"} err="failed to get container status \"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\": rpc error: code = NotFound desc = could not find container \"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\": container with ID starting with c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.362443 4774 scope.go:117] "RemoveContainer" containerID="be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.362774 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4"} err="failed to get container status \"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\": rpc error: code = NotFound desc = could not find container \"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\": container with ID starting with be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.362792 4774 scope.go:117] "RemoveContainer" containerID="21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.363066 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37"} err="failed to get container status \"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\": rpc error: code = NotFound desc = could not find container \"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\": container with ID starting with 21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.363091 4774 scope.go:117] "RemoveContainer" containerID="b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.364147 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135"} err="failed to get 
container status \"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\": rpc error: code = NotFound desc = could not find container \"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\": container with ID starting with b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.364184 4774 scope.go:117] "RemoveContainer" containerID="86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.364572 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c"} err="failed to get container status \"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c\": rpc error: code = NotFound desc = could not find container \"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c\": container with ID starting with 86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.364599 4774 scope.go:117] "RemoveContainer" containerID="a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.364948 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886"} err="failed to get container status \"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\": rpc error: code = NotFound desc = could not find container \"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\": container with ID starting with a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.364997 4774 scope.go:117] "RemoveContainer" containerID="33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.365279 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907"} err="failed to get container status \"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\": rpc error: code = NotFound desc = could not find container \"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\": container with ID starting with 33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.365313 4774 scope.go:117] "RemoveContainer" containerID="4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.366027 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62"} err="failed to get container status \"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\": rpc error: code = NotFound desc = could not find container \"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\": container with ID starting with 4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.366047 4774 scope.go:117] "RemoveContainer" 
containerID="18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.366321 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f"} err="failed to get container status \"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\": rpc error: code = NotFound desc = could not find container \"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\": container with ID starting with 18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.366360 4774 scope.go:117] "RemoveContainer" containerID="c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.366663 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d"} err="failed to get container status \"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\": rpc error: code = NotFound desc = could not find container \"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\": container with ID starting with c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.366692 4774 scope.go:117] "RemoveContainer" containerID="be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.366967 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" path="/var/lib/kubelet/pods/db881c9d-a960-48ae-93bf-d0ccd687e0b9/volumes" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.367011 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4"} err="failed to get container status \"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\": rpc error: code = NotFound desc = could not find container \"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\": container with ID starting with be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.367190 4774 scope.go:117] "RemoveContainer" containerID="21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.368886 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37"} err="failed to get container status \"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\": rpc error: code = NotFound desc = could not find container \"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\": container with ID starting with 21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.368917 4774 scope.go:117] "RemoveContainer" containerID="b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.369337 4774 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135"} err="failed to get container status \"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\": rpc error: code = NotFound desc = could not find container \"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\": container with ID starting with b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135 not found: ID does not exist" Jan 27 00:18:23 crc kubenswrapper[4774]: I0127 00:18:23.133539 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" event={"ID":"ba74c484-716d-469a-9ac1-299eb234026a","Type":"ContainerStarted","Data":"fe3b93208880432c7317d8b671d58aeab8649a43fa9b0515bd3c226984526703"} Jan 27 00:18:23 crc kubenswrapper[4774]: I0127 00:18:23.133932 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" event={"ID":"ba74c484-716d-469a-9ac1-299eb234026a","Type":"ContainerStarted","Data":"755bee5634b393794a6e1807590e0b71ca56a97a4b8e1415956c704a7eca9105"} Jan 27 00:18:23 crc kubenswrapper[4774]: I0127 00:18:23.133947 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" event={"ID":"ba74c484-716d-469a-9ac1-299eb234026a","Type":"ContainerStarted","Data":"bb72bf099ff429daba307aed2238a54bcabd3965341914ad0a47f9a484960668"} Jan 27 00:18:23 crc kubenswrapper[4774]: I0127 00:18:23.133961 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" event={"ID":"ba74c484-716d-469a-9ac1-299eb234026a","Type":"ContainerStarted","Data":"a54be5b92d1355b2973911ba966b4b9d7385eba1caeb7cf0b62ac06a04f034e9"} Jan 27 00:18:23 crc kubenswrapper[4774]: I0127 00:18:23.133973 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" event={"ID":"ba74c484-716d-469a-9ac1-299eb234026a","Type":"ContainerStarted","Data":"e745792bcc06dc6a565c7d5cf4c37c30d3e00d55170e55c0e03c8c151fb35750"} Jan 27 00:18:23 crc kubenswrapper[4774]: I0127 00:18:23.133990 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" event={"ID":"ba74c484-716d-469a-9ac1-299eb234026a","Type":"ContainerStarted","Data":"0afd07e16c7708dcd690cb6d4fefe712540b9e15c6e7d5c11aeba439b5c047c3"} Jan 27 00:18:25 crc kubenswrapper[4774]: I0127 00:18:25.157495 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" event={"ID":"ba74c484-716d-469a-9ac1-299eb234026a","Type":"ContainerStarted","Data":"a6873f3caf97359917bea9f3d3de8ea2283ad867a4c5bae70d19dd80f7d94cfb"} Jan 27 00:18:28 crc kubenswrapper[4774]: I0127 00:18:28.187385 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" event={"ID":"ba74c484-716d-469a-9ac1-299eb234026a","Type":"ContainerStarted","Data":"a9d31913a209a867409b2d0a7680feb6dac5c275bcb22e87f08ec0743f366b48"} Jan 27 00:18:28 crc kubenswrapper[4774]: I0127 00:18:28.187897 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:28 crc kubenswrapper[4774]: I0127 00:18:28.187919 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:28 crc kubenswrapper[4774]: I0127 00:18:28.187931 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:28 crc kubenswrapper[4774]: I0127 00:18:28.224659 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" podStartSLOduration=7.224640103 podStartE2EDuration="7.224640103s" podCreationTimestamp="2026-01-27 00:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:18:28.220728829 +0000 UTC m=+686.526505723" watchObservedRunningTime="2026-01-27 00:18:28.224640103 +0000 UTC m=+686.530416987" Jan 27 00:18:28 crc kubenswrapper[4774]: I0127 00:18:28.224980 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:28 crc kubenswrapper[4774]: I0127 00:18:28.228452 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:32 crc kubenswrapper[4774]: I0127 00:18:32.364952 4774 scope.go:117] "RemoveContainer" containerID="b80af4d88f8c0edcc1099c3d9d22e61df1448e662e454061bfd5aab1317d804c" Jan 27 00:18:32 crc kubenswrapper[4774]: E0127 00:18:32.366073 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-mtz9l_openshift-multus(0abcf78e-9b05-4b89-94f3-4d3230886ce0)\"" pod="openshift-multus/multus-mtz9l" podUID="0abcf78e-9b05-4b89-94f3-4d3230886ce0" Jan 27 00:18:44 crc kubenswrapper[4774]: I0127 00:18:44.358005 4774 scope.go:117] "RemoveContainer" containerID="b80af4d88f8c0edcc1099c3d9d22e61df1448e662e454061bfd5aab1317d804c" Jan 27 00:18:45 crc kubenswrapper[4774]: I0127 00:18:45.315155 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtz9l_0abcf78e-9b05-4b89-94f3-4d3230886ce0/kube-multus/2.log" Jan 27 00:18:45 crc kubenswrapper[4774]: I0127 00:18:45.315706 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mtz9l" event={"ID":"0abcf78e-9b05-4b89-94f3-4d3230886ce0","Type":"ContainerStarted","Data":"958f1ed8b4448031ab68fded275acfa4629d5770bab597ec1754a1cd5bd04722"} Jan 27 00:18:51 crc kubenswrapper[4774]: I0127 00:18:51.665730 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:19:06 crc kubenswrapper[4774]: I0127 00:19:06.675751 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:19:06 crc kubenswrapper[4774]: I0127 00:19:06.677140 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:19:30 crc kubenswrapper[4774]: I0127 00:19:30.391907 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb9l9"] Jan 27 00:19:30 crc kubenswrapper[4774]: I0127 00:19:30.394811 4774 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-gb9l9" podUID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerName="registry-server" containerID="cri-o://ab6b4ff7811ed8cdc4187956ce47abd65643238fa2c4fc2041db620b531de40f" gracePeriod=30 Jan 27 00:19:30 crc kubenswrapper[4774]: I0127 00:19:30.648195 4774 generic.go:334] "Generic (PLEG): container finished" podID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerID="ab6b4ff7811ed8cdc4187956ce47abd65643238fa2c4fc2041db620b531de40f" exitCode=0 Jan 27 00:19:30 crc kubenswrapper[4774]: I0127 00:19:30.648264 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb9l9" event={"ID":"36542624-2b85-40c0-a571-9ded4d2dcb9b","Type":"ContainerDied","Data":"ab6b4ff7811ed8cdc4187956ce47abd65643238fa2c4fc2041db620b531de40f"} Jan 27 00:19:30 crc kubenswrapper[4774]: I0127 00:19:30.805921 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:19:30 crc kubenswrapper[4774]: I0127 00:19:30.977463 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-utilities\") pod \"36542624-2b85-40c0-a571-9ded4d2dcb9b\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " Jan 27 00:19:30 crc kubenswrapper[4774]: I0127 00:19:30.977572 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzbhw\" (UniqueName: \"kubernetes.io/projected/36542624-2b85-40c0-a571-9ded4d2dcb9b-kube-api-access-wzbhw\") pod \"36542624-2b85-40c0-a571-9ded4d2dcb9b\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " Jan 27 00:19:30 crc kubenswrapper[4774]: I0127 00:19:30.977640 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-catalog-content\") pod \"36542624-2b85-40c0-a571-9ded4d2dcb9b\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " Jan 27 00:19:30 crc kubenswrapper[4774]: I0127 00:19:30.978981 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-utilities" (OuterVolumeSpecName: "utilities") pod "36542624-2b85-40c0-a571-9ded4d2dcb9b" (UID: "36542624-2b85-40c0-a571-9ded4d2dcb9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:19:30 crc kubenswrapper[4774]: I0127 00:19:30.993771 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36542624-2b85-40c0-a571-9ded4d2dcb9b-kube-api-access-wzbhw" (OuterVolumeSpecName: "kube-api-access-wzbhw") pod "36542624-2b85-40c0-a571-9ded4d2dcb9b" (UID: "36542624-2b85-40c0-a571-9ded4d2dcb9b"). InnerVolumeSpecName "kube-api-access-wzbhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.001939 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36542624-2b85-40c0-a571-9ded4d2dcb9b" (UID: "36542624-2b85-40c0-a571-9ded4d2dcb9b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.079976 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.080313 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzbhw\" (UniqueName: \"kubernetes.io/projected/36542624-2b85-40c0-a571-9ded4d2dcb9b-kube-api-access-wzbhw\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.080383 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.659273 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb9l9" event={"ID":"36542624-2b85-40c0-a571-9ded4d2dcb9b","Type":"ContainerDied","Data":"bc02ff041620d6abe4f9549d299ee2ebf620ffad6df25e35936cc68f50fc4bcd"} Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.659780 4774 scope.go:117] "RemoveContainer" containerID="ab6b4ff7811ed8cdc4187956ce47abd65643238fa2c4fc2041db620b531de40f" Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.659377 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.705653 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb9l9"] Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.708398 4774 scope.go:117] "RemoveContainer" containerID="20d5f60ef29061380fe8995d910d907f50edc601d69e30e44c129e980a103c38" Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.730131 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb9l9"] Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.766763 4774 scope.go:117] "RemoveContainer" containerID="51eaded7e7d2f079a111fb785033a43f6ec05414382be88a1e655bf429602842" Jan 27 00:19:32 crc kubenswrapper[4774]: I0127 00:19:32.369758 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36542624-2b85-40c0-a571-9ded4d2dcb9b" path="/var/lib/kubelet/pods/36542624-2b85-40c0-a571-9ded4d2dcb9b/volumes" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.526821 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2"] Jan 27 00:19:34 crc kubenswrapper[4774]: E0127 00:19:34.527476 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerName="registry-server" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.527492 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerName="registry-server" Jan 27 00:19:34 crc kubenswrapper[4774]: E0127 00:19:34.527518 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerName="extract-utilities" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.527527 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerName="extract-utilities" Jan 27 00:19:34 crc 
kubenswrapper[4774]: E0127 00:19:34.527543 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerName="extract-content" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.527551 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerName="extract-content" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.527670 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerName="registry-server" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.528600 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.531025 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.540215 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2"] Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.557280 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.557849 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.557947 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g27tj\" (UniqueName: \"kubernetes.io/projected/f2086af6-4ed5-4f66-bf01-aa661ba5a168-kube-api-access-g27tj\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.659463 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.659517 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.659539 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g27tj\" (UniqueName: \"kubernetes.io/projected/f2086af6-4ed5-4f66-bf01-aa661ba5a168-kube-api-access-g27tj\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.660458 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.660571 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.679989 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g27tj\" (UniqueName: \"kubernetes.io/projected/f2086af6-4ed5-4f66-bf01-aa661ba5a168-kube-api-access-g27tj\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.851058 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:35 crc kubenswrapper[4774]: I0127 00:19:35.266773 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2"] Jan 27 00:19:35 crc kubenswrapper[4774]: I0127 00:19:35.684100 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" event={"ID":"f2086af6-4ed5-4f66-bf01-aa661ba5a168","Type":"ContainerStarted","Data":"d8bfeab9be4b695cf6031430494c06a374a7a674a2bc6e25bc4e6797521a722b"} Jan 27 00:19:35 crc kubenswrapper[4774]: I0127 00:19:35.684454 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" event={"ID":"f2086af6-4ed5-4f66-bf01-aa661ba5a168","Type":"ContainerStarted","Data":"97e1c6bd5cd72e557d94c68a32981b66a99aed46052345d6d84fff6b3f9a03d8"} Jan 27 00:19:36 crc kubenswrapper[4774]: I0127 00:19:36.675582 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:19:36 crc kubenswrapper[4774]: I0127 00:19:36.675676 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:19:36 crc kubenswrapper[4774]: I0127 00:19:36.698263 4774 generic.go:334] "Generic (PLEG): container finished" podID="f2086af6-4ed5-4f66-bf01-aa661ba5a168" containerID="d8bfeab9be4b695cf6031430494c06a374a7a674a2bc6e25bc4e6797521a722b" exitCode=0 Jan 27 00:19:36 crc kubenswrapper[4774]: I0127 00:19:36.698331 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" event={"ID":"f2086af6-4ed5-4f66-bf01-aa661ba5a168","Type":"ContainerDied","Data":"d8bfeab9be4b695cf6031430494c06a374a7a674a2bc6e25bc4e6797521a722b"} Jan 27 00:19:36 crc kubenswrapper[4774]: I0127 00:19:36.701090 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 00:19:38 crc kubenswrapper[4774]: I0127 00:19:38.716122 4774 generic.go:334] "Generic (PLEG): container finished" podID="f2086af6-4ed5-4f66-bf01-aa661ba5a168" containerID="596d0d773d00494f782d77dbf041bb0e68fdbd48057eb27f130de14f4558ec47" exitCode=0 Jan 27 00:19:38 crc kubenswrapper[4774]: I0127 00:19:38.716281 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" event={"ID":"f2086af6-4ed5-4f66-bf01-aa661ba5a168","Type":"ContainerDied","Data":"596d0d773d00494f782d77dbf041bb0e68fdbd48057eb27f130de14f4558ec47"} Jan 27 00:19:39 crc kubenswrapper[4774]: I0127 00:19:39.725504 4774 generic.go:334] "Generic (PLEG): container finished" podID="f2086af6-4ed5-4f66-bf01-aa661ba5a168" containerID="5e13ab57abb6cae526ec9bcd9a02622712c7d2f7fd61e820ecb36b6e98e5d7b1" exitCode=0 Jan 27 00:19:39 crc kubenswrapper[4774]: I0127 00:19:39.725592 4774 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" event={"ID":"f2086af6-4ed5-4f66-bf01-aa661ba5a168","Type":"ContainerDied","Data":"5e13ab57abb6cae526ec9bcd9a02622712c7d2f7fd61e820ecb36b6e98e5d7b1"} Jan 27 00:19:40 crc kubenswrapper[4774]: I0127 00:19:40.960475 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.059199 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-bundle\") pod \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.059258 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g27tj\" (UniqueName: \"kubernetes.io/projected/f2086af6-4ed5-4f66-bf01-aa661ba5a168-kube-api-access-g27tj\") pod \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.059340 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-util\") pod \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.061787 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-bundle" (OuterVolumeSpecName: "bundle") pod "f2086af6-4ed5-4f66-bf01-aa661ba5a168" (UID: "f2086af6-4ed5-4f66-bf01-aa661ba5a168"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.068112 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2086af6-4ed5-4f66-bf01-aa661ba5a168-kube-api-access-g27tj" (OuterVolumeSpecName: "kube-api-access-g27tj") pod "f2086af6-4ed5-4f66-bf01-aa661ba5a168" (UID: "f2086af6-4ed5-4f66-bf01-aa661ba5a168"). InnerVolumeSpecName "kube-api-access-g27tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.071477 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-util" (OuterVolumeSpecName: "util") pod "f2086af6-4ed5-4f66-bf01-aa661ba5a168" (UID: "f2086af6-4ed5-4f66-bf01-aa661ba5a168"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.160877 4774 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.160917 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g27tj\" (UniqueName: \"kubernetes.io/projected/f2086af6-4ed5-4f66-bf01-aa661ba5a168-kube-api-access-g27tj\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.160929 4774 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-util\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.738804 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" event={"ID":"f2086af6-4ed5-4f66-bf01-aa661ba5a168","Type":"ContainerDied","Data":"97e1c6bd5cd72e557d94c68a32981b66a99aed46052345d6d84fff6b3f9a03d8"} Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.739321 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97e1c6bd5cd72e557d94c68a32981b66a99aed46052345d6d84fff6b3f9a03d8" Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.738949 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.779381 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb"] Jan 27 00:19:42 crc kubenswrapper[4774]: E0127 00:19:42.779654 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2086af6-4ed5-4f66-bf01-aa661ba5a168" containerName="pull" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.779671 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2086af6-4ed5-4f66-bf01-aa661ba5a168" containerName="pull" Jan 27 00:19:42 crc kubenswrapper[4774]: E0127 00:19:42.779688 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2086af6-4ed5-4f66-bf01-aa661ba5a168" containerName="extract" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.779700 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2086af6-4ed5-4f66-bf01-aa661ba5a168" containerName="extract" Jan 27 00:19:42 crc kubenswrapper[4774]: E0127 00:19:42.779727 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2086af6-4ed5-4f66-bf01-aa661ba5a168" containerName="util" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.779735 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2086af6-4ed5-4f66-bf01-aa661ba5a168" containerName="util" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.779854 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2086af6-4ed5-4f66-bf01-aa661ba5a168" containerName="extract" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.780793 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.784366 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.805756 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb"] Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.882800 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z7wd\" (UniqueName: \"kubernetes.io/projected/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-kube-api-access-5z7wd\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.882905 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.882964 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.890008 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pdxh5"] Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.891285 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.907907 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pdxh5"] Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.984287 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfmdd\" (UniqueName: \"kubernetes.io/projected/4c9c1faf-541d-4491-b37e-99909c74944b-kube-api-access-tfmdd\") pod \"redhat-operators-pdxh5\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.984378 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.984425 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-utilities\") pod \"redhat-operators-pdxh5\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.984503 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.984580 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z7wd\" (UniqueName: \"kubernetes.io/projected/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-kube-api-access-5z7wd\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.984618 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-catalog-content\") pod \"redhat-operators-pdxh5\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.985005 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.985099 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-bundle\") pod 
\"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.012943 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z7wd\" (UniqueName: \"kubernetes.io/projected/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-kube-api-access-5z7wd\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.062020 4774 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.086373 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-utilities\") pod \"redhat-operators-pdxh5\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.086458 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-catalog-content\") pod \"redhat-operators-pdxh5\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.086486 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfmdd\" (UniqueName: \"kubernetes.io/projected/4c9c1faf-541d-4491-b37e-99909c74944b-kube-api-access-tfmdd\") pod \"redhat-operators-pdxh5\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.087164 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-utilities\") pod \"redhat-operators-pdxh5\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.087336 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-catalog-content\") pod \"redhat-operators-pdxh5\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.105816 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfmdd\" (UniqueName: \"kubernetes.io/projected/4c9c1faf-541d-4491-b37e-99909c74944b-kube-api-access-tfmdd\") pod \"redhat-operators-pdxh5\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.116884 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.220548 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.382003 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb"] Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.474440 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pdxh5"] Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.547591 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b"] Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.548556 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.560939 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b"] Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.694427 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.695107 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.695231 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd4tp\" (UniqueName: \"kubernetes.io/projected/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-kube-api-access-gd4tp\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.762534 4774 generic.go:334] "Generic (PLEG): container finished" podID="4c9c1faf-541d-4491-b37e-99909c74944b" containerID="44bf8961053363986c01a6554172bfd8b05751670ed90f16bf1e2f86b2ba58dc" exitCode=0 Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.762588 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdxh5" event={"ID":"4c9c1faf-541d-4491-b37e-99909c74944b","Type":"ContainerDied","Data":"44bf8961053363986c01a6554172bfd8b05751670ed90f16bf1e2f86b2ba58dc"} Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.762649 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdxh5" event={"ID":"4c9c1faf-541d-4491-b37e-99909c74944b","Type":"ContainerStarted","Data":"23a340b31e207dc1319a830ec2ed78f64feba461f1f191f536b89271a9b624f9"} Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.764639 4774 generic.go:334] "Generic (PLEG): 
container finished" podID="bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" containerID="ee8f24b32a8f3f27163d6f15de8e84175a69f228c191d9c0adacc759c7ce0121" exitCode=0 Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.764666 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" event={"ID":"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06","Type":"ContainerDied","Data":"ee8f24b32a8f3f27163d6f15de8e84175a69f228c191d9c0adacc759c7ce0121"} Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.764681 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" event={"ID":"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06","Type":"ContainerStarted","Data":"f686dbacce0862c41ca0b35de88c3a41d09cd938400978d49b9794435f651b3e"} Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.797137 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd4tp\" (UniqueName: \"kubernetes.io/projected/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-kube-api-access-gd4tp\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.797202 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.797229 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.798170 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.798271 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.830193 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd4tp\" (UniqueName: \"kubernetes.io/projected/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-kube-api-access-gd4tp\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " 
pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.861362 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:44 crc kubenswrapper[4774]: I0127 00:19:44.075598 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b"] Jan 27 00:19:44 crc kubenswrapper[4774]: W0127 00:19:44.089971 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b3f5387_0b7c_4c6d_8aff_e293c038aafb.slice/crio-fd923b040df120c4e0db13faba3dd082748215832fde69efa3820a548d40466f WatchSource:0}: Error finding container fd923b040df120c4e0db13faba3dd082748215832fde69efa3820a548d40466f: Status 404 returned error can't find the container with id fd923b040df120c4e0db13faba3dd082748215832fde69efa3820a548d40466f Jan 27 00:19:44 crc kubenswrapper[4774]: I0127 00:19:44.777978 4774 generic.go:334] "Generic (PLEG): container finished" podID="1b3f5387-0b7c-4c6d-8aff-e293c038aafb" containerID="103d19fe0051a0aa6b54cfba0e223333a3f8830a2651e9586c3a424245aa9a74" exitCode=0 Jan 27 00:19:44 crc kubenswrapper[4774]: I0127 00:19:44.778048 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" event={"ID":"1b3f5387-0b7c-4c6d-8aff-e293c038aafb","Type":"ContainerDied","Data":"103d19fe0051a0aa6b54cfba0e223333a3f8830a2651e9586c3a424245aa9a74"} Jan 27 00:19:44 crc kubenswrapper[4774]: I0127 00:19:44.778089 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" event={"ID":"1b3f5387-0b7c-4c6d-8aff-e293c038aafb","Type":"ContainerStarted","Data":"fd923b040df120c4e0db13faba3dd082748215832fde69efa3820a548d40466f"} Jan 27 00:19:45 crc kubenswrapper[4774]: I0127 00:19:45.784576 4774 generic.go:334] "Generic (PLEG): container finished" podID="bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" containerID="6dcbc08c278885cd2711653b6cf428e3c4856ecf3d3fb60fd92a0d6210e72762" exitCode=0 Jan 27 00:19:45 crc kubenswrapper[4774]: I0127 00:19:45.784664 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" event={"ID":"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06","Type":"ContainerDied","Data":"6dcbc08c278885cd2711653b6cf428e3c4856ecf3d3fb60fd92a0d6210e72762"} Jan 27 00:19:45 crc kubenswrapper[4774]: I0127 00:19:45.787552 4774 generic.go:334] "Generic (PLEG): container finished" podID="4c9c1faf-541d-4491-b37e-99909c74944b" containerID="92dbedab7e3a765202cdab5c278b77c503ff9fdad67f1ad130701e4b0d133e1e" exitCode=0 Jan 27 00:19:45 crc kubenswrapper[4774]: I0127 00:19:45.787652 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdxh5" event={"ID":"4c9c1faf-541d-4491-b37e-99909c74944b","Type":"ContainerDied","Data":"92dbedab7e3a765202cdab5c278b77c503ff9fdad67f1ad130701e4b0d133e1e"} Jan 27 00:19:45 crc kubenswrapper[4774]: I0127 00:19:45.791069 4774 generic.go:334] "Generic (PLEG): container finished" podID="1b3f5387-0b7c-4c6d-8aff-e293c038aafb" containerID="7cbb5b9beaea9330095805348572843bf7b2a5e4897bdf713b917fe46bc90ea9" exitCode=0 Jan 27 00:19:45 crc kubenswrapper[4774]: I0127 
00:19:45.791131 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" event={"ID":"1b3f5387-0b7c-4c6d-8aff-e293c038aafb","Type":"ContainerDied","Data":"7cbb5b9beaea9330095805348572843bf7b2a5e4897bdf713b917fe46bc90ea9"} Jan 27 00:19:46 crc kubenswrapper[4774]: I0127 00:19:46.799017 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdxh5" event={"ID":"4c9c1faf-541d-4491-b37e-99909c74944b","Type":"ContainerStarted","Data":"27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3"} Jan 27 00:19:46 crc kubenswrapper[4774]: I0127 00:19:46.802894 4774 generic.go:334] "Generic (PLEG): container finished" podID="1b3f5387-0b7c-4c6d-8aff-e293c038aafb" containerID="826946182f64aca9f23e2c531367ec09b2c0c5d61341612c15408d7ac154875d" exitCode=0 Jan 27 00:19:46 crc kubenswrapper[4774]: I0127 00:19:46.803066 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" event={"ID":"1b3f5387-0b7c-4c6d-8aff-e293c038aafb","Type":"ContainerDied","Data":"826946182f64aca9f23e2c531367ec09b2c0c5d61341612c15408d7ac154875d"} Jan 27 00:19:46 crc kubenswrapper[4774]: I0127 00:19:46.805665 4774 generic.go:334] "Generic (PLEG): container finished" podID="bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" containerID="ce083fb15a33bddd0ad4aad29742003d6de2d178017df06cbbf71d4c4d53f156" exitCode=0 Jan 27 00:19:46 crc kubenswrapper[4774]: I0127 00:19:46.805765 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" event={"ID":"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06","Type":"ContainerDied","Data":"ce083fb15a33bddd0ad4aad29742003d6de2d178017df06cbbf71d4c4d53f156"} Jan 27 00:19:46 crc kubenswrapper[4774]: I0127 00:19:46.831951 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pdxh5" podStartSLOduration=2.384887124 podStartE2EDuration="4.831929866s" podCreationTimestamp="2026-01-27 00:19:42 +0000 UTC" firstStartedPulling="2026-01-27 00:19:43.76475625 +0000 UTC m=+762.070533124" lastFinishedPulling="2026-01-27 00:19:46.211798982 +0000 UTC m=+764.517575866" observedRunningTime="2026-01-27 00:19:46.829181753 +0000 UTC m=+765.134958637" watchObservedRunningTime="2026-01-27 00:19:46.831929866 +0000 UTC m=+765.137706750" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.449380 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq"] Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.450477 4774 util.go:30] "No sandbox for pod can be found. 
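Note on the "Observed pod startup duration" entry above for openshift-marketplace/redhat-operators-pdxh5: the logged podStartSLOduration (2.384887124s) equals the logged podStartE2EDuration (4.831929866s) minus the image-pull window, taking the pull window from the monotonic readings (m=+762.070533124 to m=+764.517575866). The sketch below is a minimal Go reproduction of that arithmetic from the logged values; the variable names and the exact formula are our reading of this one entry, not taken from the kubelet source.

```go
package main

import "fmt"

// Reproduces the startup-duration figures logged above for
// openshift-marketplace/redhat-operators-pdxh5. The constants are copied
// from the log entry; treating podStartSLOduration as "end-to-end duration
// minus the image-pull window" is an assumption that matches these numbers.
func main() {
	const (
		podStartE2EDuration = 4.831929866   // watchObservedRunningTime - podCreationTimestamp, seconds
		firstStartedPulling = 762.070533124 // monotonic reading (m=+...) from the log
		lastFinishedPulling = 764.517575866 // monotonic reading (m=+...) from the log
	)
	pullWindow := lastFinishedPulling - firstStartedPulling
	slo := podStartE2EDuration - pullWindow
	fmt.Printf("image pull window:   %.9fs\n", pullWindow) // ~2.447042742s
	fmt.Printf("podStartSLOduration: %.9fs\n", slo)        // ~2.384887124s, as logged
}
```

The later entry for openshift-marketplace/certified-operators-qdczk (podStartSLOduration=2.439468087s, podStartE2EDuration=4.966197876s, pull window m=+768.167530258 to m=+770.694260047) is consistent with the same arithmetic.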
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.477682 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq"] Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.557734 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.557894 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw46p\" (UniqueName: \"kubernetes.io/projected/4bdb298d-0c46-429d-b4c2-44d106881eb7-kube-api-access-dw46p\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.557976 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.659076 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.659512 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.659630 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw46p\" (UniqueName: \"kubernetes.io/projected/4bdb298d-0c46-429d-b4c2-44d106881eb7-kube-api-access-dw46p\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.659564 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " 
pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.660030 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.689369 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw46p\" (UniqueName: \"kubernetes.io/projected/4bdb298d-0c46-429d-b4c2-44d106881eb7-kube-api-access-dw46p\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.766917 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.257886 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.326017 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.370547 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd4tp\" (UniqueName: \"kubernetes.io/projected/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-kube-api-access-gd4tp\") pod \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.372659 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-util\") pod \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.372698 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-bundle\") pod \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.372732 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z7wd\" (UniqueName: \"kubernetes.io/projected/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-kube-api-access-5z7wd\") pod \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.372755 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-bundle\") pod \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.372872 4774 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-util\") pod \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.376463 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-kube-api-access-gd4tp" (OuterVolumeSpecName: "kube-api-access-gd4tp") pod "1b3f5387-0b7c-4c6d-8aff-e293c038aafb" (UID: "1b3f5387-0b7c-4c6d-8aff-e293c038aafb"). InnerVolumeSpecName "kube-api-access-gd4tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.377463 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-bundle" (OuterVolumeSpecName: "bundle") pod "bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" (UID: "bcd9000d-51f5-47d1-9fb0-a1177a6c6b06"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.377736 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-bundle" (OuterVolumeSpecName: "bundle") pod "1b3f5387-0b7c-4c6d-8aff-e293c038aafb" (UID: "1b3f5387-0b7c-4c6d-8aff-e293c038aafb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.380877 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-kube-api-access-5z7wd" (OuterVolumeSpecName: "kube-api-access-5z7wd") pod "bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" (UID: "bcd9000d-51f5-47d1-9fb0-a1177a6c6b06"). InnerVolumeSpecName "kube-api-access-5z7wd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.390385 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-util" (OuterVolumeSpecName: "util") pod "bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" (UID: "bcd9000d-51f5-47d1-9fb0-a1177a6c6b06"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.395815 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-util" (OuterVolumeSpecName: "util") pod "1b3f5387-0b7c-4c6d-8aff-e293c038aafb" (UID: "1b3f5387-0b7c-4c6d-8aff-e293c038aafb"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.498735 4774 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-util\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.498794 4774 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.498810 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z7wd\" (UniqueName: \"kubernetes.io/projected/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-kube-api-access-5z7wd\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.498823 4774 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.498840 4774 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-util\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.503949 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd4tp\" (UniqueName: \"kubernetes.io/projected/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-kube-api-access-gd4tp\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.551583 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq"] Jan 27 00:19:48 crc kubenswrapper[4774]: W0127 00:19:48.586127 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bdb298d_0c46_429d_b4c2_44d106881eb7.slice/crio-0b3a751df39c09e406c339e045a1a8e26cff9cffe0a00cfe8379e61eb23fe3ef WatchSource:0}: Error finding container 0b3a751df39c09e406c339e045a1a8e26cff9cffe0a00cfe8379e61eb23fe3ef: Status 404 returned error can't find the container with id 0b3a751df39c09e406c339e045a1a8e26cff9cffe0a00cfe8379e61eb23fe3ef Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.592377 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qdczk"] Jan 27 00:19:48 crc kubenswrapper[4774]: E0127 00:19:48.592576 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" containerName="pull" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.592593 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" containerName="pull" Jan 27 00:19:48 crc kubenswrapper[4774]: E0127 00:19:48.592602 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3f5387-0b7c-4c6d-8aff-e293c038aafb" containerName="extract" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.592610 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3f5387-0b7c-4c6d-8aff-e293c038aafb" containerName="extract" Jan 27 00:19:48 crc kubenswrapper[4774]: E0127 00:19:48.592620 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3f5387-0b7c-4c6d-8aff-e293c038aafb" containerName="util" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.592626 
4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3f5387-0b7c-4c6d-8aff-e293c038aafb" containerName="util" Jan 27 00:19:48 crc kubenswrapper[4774]: E0127 00:19:48.592639 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" containerName="util" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.592645 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" containerName="util" Jan 27 00:19:48 crc kubenswrapper[4774]: E0127 00:19:48.592653 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3f5387-0b7c-4c6d-8aff-e293c038aafb" containerName="pull" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.592658 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3f5387-0b7c-4c6d-8aff-e293c038aafb" containerName="pull" Jan 27 00:19:48 crc kubenswrapper[4774]: E0127 00:19:48.592668 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" containerName="extract" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.592673 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" containerName="extract" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.592762 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3f5387-0b7c-4c6d-8aff-e293c038aafb" containerName="extract" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.592775 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" containerName="extract" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.593523 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.624652 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qdczk"] Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.706810 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-utilities\") pod \"certified-operators-qdczk\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.706877 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8bxx\" (UniqueName: \"kubernetes.io/projected/bae6d967-d19c-4ab9-a2e2-21292d93389f-kube-api-access-d8bxx\") pod \"certified-operators-qdczk\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.706898 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-catalog-content\") pod \"certified-operators-qdczk\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.808096 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-utilities\") pod \"certified-operators-qdczk\" (UID: 
\"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.808155 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8bxx\" (UniqueName: \"kubernetes.io/projected/bae6d967-d19c-4ab9-a2e2-21292d93389f-kube-api-access-d8bxx\") pod \"certified-operators-qdczk\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.808178 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-catalog-content\") pod \"certified-operators-qdczk\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.808731 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-utilities\") pod \"certified-operators-qdczk\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.808773 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-catalog-content\") pod \"certified-operators-qdczk\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.828025 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8bxx\" (UniqueName: \"kubernetes.io/projected/bae6d967-d19c-4ab9-a2e2-21292d93389f-kube-api-access-d8bxx\") pod \"certified-operators-qdczk\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.846413 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" event={"ID":"1b3f5387-0b7c-4c6d-8aff-e293c038aafb","Type":"ContainerDied","Data":"fd923b040df120c4e0db13faba3dd082748215832fde69efa3820a548d40466f"} Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.846451 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.846474 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd923b040df120c4e0db13faba3dd082748215832fde69efa3820a548d40466f" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.847903 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" event={"ID":"4bdb298d-0c46-429d-b4c2-44d106881eb7","Type":"ContainerStarted","Data":"efb67e23acbfdf21e1128206db0796da92a2b8b3505fcf7184c374d562e0ebb6"} Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.847934 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" event={"ID":"4bdb298d-0c46-429d-b4c2-44d106881eb7","Type":"ContainerStarted","Data":"0b3a751df39c09e406c339e045a1a8e26cff9cffe0a00cfe8379e61eb23fe3ef"} Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.850350 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" event={"ID":"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06","Type":"ContainerDied","Data":"f686dbacce0862c41ca0b35de88c3a41d09cd938400978d49b9794435f651b3e"} Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.850396 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f686dbacce0862c41ca0b35de88c3a41d09cd938400978d49b9794435f651b3e" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.850447 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.907117 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:49 crc kubenswrapper[4774]: I0127 00:19:49.282436 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qdczk"] Jan 27 00:19:49 crc kubenswrapper[4774]: I0127 00:19:49.857438 4774 generic.go:334] "Generic (PLEG): container finished" podID="4bdb298d-0c46-429d-b4c2-44d106881eb7" containerID="efb67e23acbfdf21e1128206db0796da92a2b8b3505fcf7184c374d562e0ebb6" exitCode=0 Jan 27 00:19:49 crc kubenswrapper[4774]: I0127 00:19:49.857534 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" event={"ID":"4bdb298d-0c46-429d-b4c2-44d106881eb7","Type":"ContainerDied","Data":"efb67e23acbfdf21e1128206db0796da92a2b8b3505fcf7184c374d562e0ebb6"} Jan 27 00:19:49 crc kubenswrapper[4774]: I0127 00:19:49.859317 4774 generic.go:334] "Generic (PLEG): container finished" podID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerID="4c26c2e2994e8be6a1fd5545240fd4ba414fbf6378b30c5235cd400317e57ad6" exitCode=0 Jan 27 00:19:49 crc kubenswrapper[4774]: I0127 00:19:49.859347 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdczk" event={"ID":"bae6d967-d19c-4ab9-a2e2-21292d93389f","Type":"ContainerDied","Data":"4c26c2e2994e8be6a1fd5545240fd4ba414fbf6378b30c5235cd400317e57ad6"} Jan 27 00:19:49 crc kubenswrapper[4774]: I0127 00:19:49.859363 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdczk" event={"ID":"bae6d967-d19c-4ab9-a2e2-21292d93389f","Type":"ContainerStarted","Data":"c7f9a533937e391d5a101f3230a6fdf47cd308c1e2082bc57db248e561eaab18"} Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.700973 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9wztc"] Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.702788 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.728594 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9wztc"] Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.767212 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-utilities\") pod \"community-operators-9wztc\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.767295 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-catalog-content\") pod \"community-operators-9wztc\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.767320 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vs8t\" (UniqueName: \"kubernetes.io/projected/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-kube-api-access-8vs8t\") pod \"community-operators-9wztc\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.868378 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-catalog-content\") pod \"community-operators-9wztc\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.868429 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vs8t\" (UniqueName: \"kubernetes.io/projected/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-kube-api-access-8vs8t\") pod \"community-operators-9wztc\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.868478 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-utilities\") pod \"community-operators-9wztc\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.869136 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-catalog-content\") pod \"community-operators-9wztc\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.869166 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-utilities\") pod \"community-operators-9wztc\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.911825 4774 generic.go:334] "Generic 
(PLEG): container finished" podID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerID="200e6d7e4a8ae08ab9f21528d73e613a734eef3b1ea5439197e326a0aac07a73" exitCode=0 Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.911889 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdczk" event={"ID":"bae6d967-d19c-4ab9-a2e2-21292d93389f","Type":"ContainerDied","Data":"200e6d7e4a8ae08ab9f21528d73e613a734eef3b1ea5439197e326a0aac07a73"} Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.929238 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vs8t\" (UniqueName: \"kubernetes.io/projected/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-kube-api-access-8vs8t\") pod \"community-operators-9wztc\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:52 crc kubenswrapper[4774]: I0127 00:19:52.018386 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:52 crc kubenswrapper[4774]: I0127 00:19:52.362701 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9wztc"] Jan 27 00:19:52 crc kubenswrapper[4774]: I0127 00:19:52.921157 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdczk" event={"ID":"bae6d967-d19c-4ab9-a2e2-21292d93389f","Type":"ContainerStarted","Data":"9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180"} Jan 27 00:19:52 crc kubenswrapper[4774]: I0127 00:19:52.927590 4774 generic.go:334] "Generic (PLEG): container finished" podID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerID="e02d219daaca4ec67fc454d718eec7db071fdd5266115161621d49cc1c997588" exitCode=0 Jan 27 00:19:52 crc kubenswrapper[4774]: I0127 00:19:52.927640 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wztc" event={"ID":"4afde62c-1f8e-4d6f-87ab-b4710b6c7158","Type":"ContainerDied","Data":"e02d219daaca4ec67fc454d718eec7db071fdd5266115161621d49cc1c997588"} Jan 27 00:19:52 crc kubenswrapper[4774]: I0127 00:19:52.927669 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wztc" event={"ID":"4afde62c-1f8e-4d6f-87ab-b4710b6c7158","Type":"ContainerStarted","Data":"f277365d77be95b43a841265074b0061aae099b70be4de813c5361a232b01168"} Jan 27 00:19:52 crc kubenswrapper[4774]: I0127 00:19:52.966220 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qdczk" podStartSLOduration=2.439468087 podStartE2EDuration="4.966197876s" podCreationTimestamp="2026-01-27 00:19:48 +0000 UTC" firstStartedPulling="2026-01-27 00:19:49.861753374 +0000 UTC m=+768.167530258" lastFinishedPulling="2026-01-27 00:19:52.388483163 +0000 UTC m=+770.694260047" observedRunningTime="2026-01-27 00:19:52.963277148 +0000 UTC m=+771.269054052" watchObservedRunningTime="2026-01-27 00:19:52.966197876 +0000 UTC m=+771.271974760" Jan 27 00:19:53 crc kubenswrapper[4774]: I0127 00:19:53.220985 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:53 crc kubenswrapper[4774]: I0127 00:19:53.221038 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:54 crc kubenswrapper[4774]: I0127 00:19:54.261337 4774 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pdxh5" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" containerName="registry-server" probeResult="failure" output=< Jan 27 00:19:54 crc kubenswrapper[4774]: timeout: failed to connect service ":50051" within 1s Jan 27 00:19:54 crc kubenswrapper[4774]: > Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.399676 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5"] Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.400943 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.404630 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.404921 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-sp9r2" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.405751 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.433791 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5"] Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.522758 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqdl8\" (UniqueName: \"kubernetes.io/projected/82ad6e88-a32b-4f4f-9a96-66d10c58a7d9-kube-api-access-qqdl8\") pod \"obo-prometheus-operator-68bc856cb9-qzvj5\" (UID: \"82ad6e88-a32b-4f4f-9a96-66d10c58a7d9\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.532754 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl"] Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.533803 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.536270 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-mbppc" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.542455 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.543565 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r"] Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.544289 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.555163 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl"] Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.563514 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r"] Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.623879 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dfee63d3-9a5d-46f6-b984-78d6a837e20c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-2b75r\" (UID: \"dfee63d3-9a5d-46f6-b984-78d6a837e20c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.623945 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ac18775-726a-43da-a184-dfd1565544f1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-548dl\" (UID: \"4ac18775-726a-43da-a184-dfd1565544f1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.623984 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqdl8\" (UniqueName: \"kubernetes.io/projected/82ad6e88-a32b-4f4f-9a96-66d10c58a7d9-kube-api-access-qqdl8\") pod \"obo-prometheus-operator-68bc856cb9-qzvj5\" (UID: \"82ad6e88-a32b-4f4f-9a96-66d10c58a7d9\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.624283 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ac18775-726a-43da-a184-dfd1565544f1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-548dl\" (UID: \"4ac18775-726a-43da-a184-dfd1565544f1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.624524 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dfee63d3-9a5d-46f6-b984-78d6a837e20c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-2b75r\" (UID: \"dfee63d3-9a5d-46f6-b984-78d6a837e20c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.662143 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqdl8\" (UniqueName: \"kubernetes.io/projected/82ad6e88-a32b-4f4f-9a96-66d10c58a7d9-kube-api-access-qqdl8\") pod \"obo-prometheus-operator-68bc856cb9-qzvj5\" (UID: \"82ad6e88-a32b-4f4f-9a96-66d10c58a7d9\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.726228 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ac18775-726a-43da-a184-dfd1565544f1-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-6495c554dc-548dl\" (UID: \"4ac18775-726a-43da-a184-dfd1565544f1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.726325 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dfee63d3-9a5d-46f6-b984-78d6a837e20c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-2b75r\" (UID: \"dfee63d3-9a5d-46f6-b984-78d6a837e20c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.726374 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dfee63d3-9a5d-46f6-b984-78d6a837e20c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-2b75r\" (UID: \"dfee63d3-9a5d-46f6-b984-78d6a837e20c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.726404 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ac18775-726a-43da-a184-dfd1565544f1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-548dl\" (UID: \"4ac18775-726a-43da-a184-dfd1565544f1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.732586 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dfee63d3-9a5d-46f6-b984-78d6a837e20c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-2b75r\" (UID: \"dfee63d3-9a5d-46f6-b984-78d6a837e20c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.734372 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ac18775-726a-43da-a184-dfd1565544f1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-548dl\" (UID: \"4ac18775-726a-43da-a184-dfd1565544f1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.735052 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dfee63d3-9a5d-46f6-b984-78d6a837e20c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-2b75r\" (UID: \"dfee63d3-9a5d-46f6-b984-78d6a837e20c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.739114 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ac18775-726a-43da-a184-dfd1565544f1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-548dl\" (UID: \"4ac18775-726a-43da-a184-dfd1565544f1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.739313 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.789486 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-fqps4"] Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.790593 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-fqps4" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.797228 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.800823 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-6l2g4" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.805615 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-fqps4"] Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.854287 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.861545 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.930126 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2222c\" (UniqueName: \"kubernetes.io/projected/3247d37e-1277-411a-ad8b-ffcd6172206f-kube-api-access-2222c\") pod \"observability-operator-59bdc8b94-fqps4\" (UID: \"3247d37e-1277-411a-ad8b-ffcd6172206f\") " pod="openshift-operators/observability-operator-59bdc8b94-fqps4" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.930195 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3247d37e-1277-411a-ad8b-ffcd6172206f-observability-operator-tls\") pod \"observability-operator-59bdc8b94-fqps4\" (UID: \"3247d37e-1277-411a-ad8b-ffcd6172206f\") " pod="openshift-operators/observability-operator-59bdc8b94-fqps4" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.969493 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" event={"ID":"4bdb298d-0c46-429d-b4c2-44d106881eb7","Type":"ContainerStarted","Data":"51c589dd8edac479f82cbd8c35b089e741262f8274ee231b1ad43eb7751efc5f"} Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.986305 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wztc" event={"ID":"4afde62c-1f8e-4d6f-87ab-b4710b6c7158","Type":"ContainerStarted","Data":"6adb30d15bae1199f3712c4a8352afcec6a4c2a58544b7c28ed4b9cf80a6f3bb"} Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.031088 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2222c\" (UniqueName: \"kubernetes.io/projected/3247d37e-1277-411a-ad8b-ffcd6172206f-kube-api-access-2222c\") pod \"observability-operator-59bdc8b94-fqps4\" (UID: \"3247d37e-1277-411a-ad8b-ffcd6172206f\") " pod="openshift-operators/observability-operator-59bdc8b94-fqps4" Jan 27 00:19:56 crc 
kubenswrapper[4774]: I0127 00:19:56.031181 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3247d37e-1277-411a-ad8b-ffcd6172206f-observability-operator-tls\") pod \"observability-operator-59bdc8b94-fqps4\" (UID: \"3247d37e-1277-411a-ad8b-ffcd6172206f\") " pod="openshift-operators/observability-operator-59bdc8b94-fqps4" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.036756 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3247d37e-1277-411a-ad8b-ffcd6172206f-observability-operator-tls\") pod \"observability-operator-59bdc8b94-fqps4\" (UID: \"3247d37e-1277-411a-ad8b-ffcd6172206f\") " pod="openshift-operators/observability-operator-59bdc8b94-fqps4" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.068607 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2222c\" (UniqueName: \"kubernetes.io/projected/3247d37e-1277-411a-ad8b-ffcd6172206f-kube-api-access-2222c\") pod \"observability-operator-59bdc8b94-fqps4\" (UID: \"3247d37e-1277-411a-ad8b-ffcd6172206f\") " pod="openshift-operators/observability-operator-59bdc8b94-fqps4" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.098935 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-b7pt6"] Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.099668 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.105069 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-r49fd" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.108586 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-fqps4" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.119799 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-b7pt6"] Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.134827 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9nqq\" (UniqueName: \"kubernetes.io/projected/96fc7bad-ed57-4110-afa8-9a6e5748c292-kube-api-access-z9nqq\") pod \"perses-operator-5bf474d74f-b7pt6\" (UID: \"96fc7bad-ed57-4110-afa8-9a6e5748c292\") " pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.134960 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/96fc7bad-ed57-4110-afa8-9a6e5748c292-openshift-service-ca\") pod \"perses-operator-5bf474d74f-b7pt6\" (UID: \"96fc7bad-ed57-4110-afa8-9a6e5748c292\") " pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.238342 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9nqq\" (UniqueName: \"kubernetes.io/projected/96fc7bad-ed57-4110-afa8-9a6e5748c292-kube-api-access-z9nqq\") pod \"perses-operator-5bf474d74f-b7pt6\" (UID: \"96fc7bad-ed57-4110-afa8-9a6e5748c292\") " pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.238406 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/96fc7bad-ed57-4110-afa8-9a6e5748c292-openshift-service-ca\") pod \"perses-operator-5bf474d74f-b7pt6\" (UID: \"96fc7bad-ed57-4110-afa8-9a6e5748c292\") " pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.239627 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/96fc7bad-ed57-4110-afa8-9a6e5748c292-openshift-service-ca\") pod \"perses-operator-5bf474d74f-b7pt6\" (UID: \"96fc7bad-ed57-4110-afa8-9a6e5748c292\") " pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.303529 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-xcbvz"] Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.304511 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-xcbvz" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.306197 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.306471 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-4gbp9" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.309292 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-xcbvz"] Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.312751 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.319756 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9nqq\" (UniqueName: \"kubernetes.io/projected/96fc7bad-ed57-4110-afa8-9a6e5748c292-kube-api-access-z9nqq\") pod \"perses-operator-5bf474d74f-b7pt6\" (UID: \"96fc7bad-ed57-4110-afa8-9a6e5748c292\") " pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.341090 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzln8\" (UniqueName: \"kubernetes.io/projected/f2a5d99b-17f5-4a46-958b-ae997b57245e-kube-api-access-rzln8\") pod \"interconnect-operator-5bb49f789d-xcbvz\" (UID: \"f2a5d99b-17f5-4a46-958b-ae997b57245e\") " pod="service-telemetry/interconnect-operator-5bb49f789d-xcbvz" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.389602 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5"] Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.421947 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl"] Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.443910 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzln8\" (UniqueName: \"kubernetes.io/projected/f2a5d99b-17f5-4a46-958b-ae997b57245e-kube-api-access-rzln8\") pod \"interconnect-operator-5bb49f789d-xcbvz\" (UID: \"f2a5d99b-17f5-4a46-958b-ae997b57245e\") " pod="service-telemetry/interconnect-operator-5bb49f789d-xcbvz" Jan 27 00:19:56 crc kubenswrapper[4774]: W0127 00:19:56.444488 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82ad6e88_a32b_4f4f_9a96_66d10c58a7d9.slice/crio-284d11414e55ca8dab56eea35d18f671d13330367fb3820e00dd13656d6118b8 WatchSource:0}: Error finding container 284d11414e55ca8dab56eea35d18f671d13330367fb3820e00dd13656d6118b8: Status 404 returned error can't find the container with id 284d11414e55ca8dab56eea35d18f671d13330367fb3820e00dd13656d6118b8 Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.472701 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzln8\" (UniqueName: \"kubernetes.io/projected/f2a5d99b-17f5-4a46-958b-ae997b57245e-kube-api-access-rzln8\") pod \"interconnect-operator-5bb49f789d-xcbvz\" (UID: \"f2a5d99b-17f5-4a46-958b-ae997b57245e\") " pod="service-telemetry/interconnect-operator-5bb49f789d-xcbvz" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.476576 4774 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.599738 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-fqps4"] Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:56.615429 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r"] Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:56.661790 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-xcbvz" Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:56.996327 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" event={"ID":"4ac18775-726a-43da-a184-dfd1565544f1","Type":"ContainerStarted","Data":"710fda1e9451d191c352c52b5bb38bf519dc16b3bce39730f227bbf272a04907"} Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:56.998909 4774 generic.go:334] "Generic (PLEG): container finished" podID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerID="6adb30d15bae1199f3712c4a8352afcec6a4c2a58544b7c28ed4b9cf80a6f3bb" exitCode=0 Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:56.999075 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wztc" event={"ID":"4afde62c-1f8e-4d6f-87ab-b4710b6c7158","Type":"ContainerDied","Data":"6adb30d15bae1199f3712c4a8352afcec6a4c2a58544b7c28ed4b9cf80a6f3bb"} Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:57.001997 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5" event={"ID":"82ad6e88-a32b-4f4f-9a96-66d10c58a7d9","Type":"ContainerStarted","Data":"284d11414e55ca8dab56eea35d18f671d13330367fb3820e00dd13656d6118b8"} Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:57.003285 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" event={"ID":"dfee63d3-9a5d-46f6-b984-78d6a837e20c","Type":"ContainerStarted","Data":"09d1242fce25c6ddeeab10eb41618a1e4c826f0c9b1869609c276595691e4b15"} Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:57.004909 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-fqps4" event={"ID":"3247d37e-1277-411a-ad8b-ffcd6172206f","Type":"ContainerStarted","Data":"6ef9d09a6906441a4b6dc94f41942b47e9cef322d2ab69022a11232682410f9e"} Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:58.021890 4774 generic.go:334] "Generic (PLEG): container finished" podID="4bdb298d-0c46-429d-b4c2-44d106881eb7" containerID="51c589dd8edac479f82cbd8c35b089e741262f8274ee231b1ad43eb7751efc5f" exitCode=0 Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:58.021947 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" event={"ID":"4bdb298d-0c46-429d-b4c2-44d106881eb7","Type":"ContainerDied","Data":"51c589dd8edac479f82cbd8c35b089e741262f8274ee231b1ad43eb7751efc5f"} Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:58.409798 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-b7pt6"] Jan 27 00:19:58 crc kubenswrapper[4774]: W0127 00:19:58.452928 4774 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96fc7bad_ed57_4110_afa8_9a6e5748c292.slice/crio-ee01d005e5e2a3afbb3cfe092ef36b4f0edf40fdb33cc589d50a166a2f084c93 WatchSource:0}: Error finding container ee01d005e5e2a3afbb3cfe092ef36b4f0edf40fdb33cc589d50a166a2f084c93: Status 404 returned error can't find the container with id ee01d005e5e2a3afbb3cfe092ef36b4f0edf40fdb33cc589d50a166a2f084c93 Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:58.674987 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-xcbvz"] Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:58.908240 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:58.908297 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:58.976056 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.046850 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wztc" event={"ID":"4afde62c-1f8e-4d6f-87ab-b4710b6c7158","Type":"ContainerStarted","Data":"c5bf743ef777224a76b80bf6d3d237f0094e6821c6315c5a72dc6d8fb0479220"} Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.057322 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-xcbvz" event={"ID":"f2a5d99b-17f5-4a46-958b-ae997b57245e","Type":"ContainerStarted","Data":"49b9392f4bdc8a90e41d7d787075c697016901a5f86ec0aeebdd4cad653eaeb0"} Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.066569 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" event={"ID":"96fc7bad-ed57-4110-afa8-9a6e5748c292","Type":"ContainerStarted","Data":"ee01d005e5e2a3afbb3cfe092ef36b4f0edf40fdb33cc589d50a166a2f084c93"} Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.076002 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9wztc" podStartSLOduration=2.394659352 podStartE2EDuration="8.075973951s" podCreationTimestamp="2026-01-27 00:19:51 +0000 UTC" firstStartedPulling="2026-01-27 00:19:52.942085061 +0000 UTC m=+771.247861945" lastFinishedPulling="2026-01-27 00:19:58.62339966 +0000 UTC m=+776.929176544" observedRunningTime="2026-01-27 00:19:59.072432127 +0000 UTC m=+777.378209021" watchObservedRunningTime="2026-01-27 00:19:59.075973951 +0000 UTC m=+777.381750855" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.077802 4774 generic.go:334] "Generic (PLEG): container finished" podID="4bdb298d-0c46-429d-b4c2-44d106881eb7" containerID="a3e4452fbf1853f438d267a232202d0745bc9c8697654b5a77a29ceb58fd6789" exitCode=0 Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.078606 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" event={"ID":"4bdb298d-0c46-429d-b4c2-44d106881eb7","Type":"ContainerDied","Data":"a3e4452fbf1853f438d267a232202d0745bc9c8697654b5a77a29ceb58fd6789"} Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.148323 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.576632 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-688b5775f4-bvrcz"] Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.577507 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.581287 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-c9bfw" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.581513 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.600441 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-688b5775f4-bvrcz"] Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.712107 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e6700f3e-4423-48fe-94ae-562483cf3a18-apiservice-cert\") pod \"elastic-operator-688b5775f4-bvrcz\" (UID: \"e6700f3e-4423-48fe-94ae-562483cf3a18\") " pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.712154 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gszqw\" (UniqueName: \"kubernetes.io/projected/e6700f3e-4423-48fe-94ae-562483cf3a18-kube-api-access-gszqw\") pod \"elastic-operator-688b5775f4-bvrcz\" (UID: \"e6700f3e-4423-48fe-94ae-562483cf3a18\") " pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.712217 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e6700f3e-4423-48fe-94ae-562483cf3a18-webhook-cert\") pod \"elastic-operator-688b5775f4-bvrcz\" (UID: \"e6700f3e-4423-48fe-94ae-562483cf3a18\") " pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.813955 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e6700f3e-4423-48fe-94ae-562483cf3a18-webhook-cert\") pod \"elastic-operator-688b5775f4-bvrcz\" (UID: \"e6700f3e-4423-48fe-94ae-562483cf3a18\") " pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.814081 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e6700f3e-4423-48fe-94ae-562483cf3a18-apiservice-cert\") pod \"elastic-operator-688b5775f4-bvrcz\" (UID: \"e6700f3e-4423-48fe-94ae-562483cf3a18\") " pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.814111 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gszqw\" (UniqueName: \"kubernetes.io/projected/e6700f3e-4423-48fe-94ae-562483cf3a18-kube-api-access-gszqw\") pod \"elastic-operator-688b5775f4-bvrcz\" (UID: \"e6700f3e-4423-48fe-94ae-562483cf3a18\") " pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.822937 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e6700f3e-4423-48fe-94ae-562483cf3a18-webhook-cert\") pod \"elastic-operator-688b5775f4-bvrcz\" (UID: \"e6700f3e-4423-48fe-94ae-562483cf3a18\") " pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.831727 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e6700f3e-4423-48fe-94ae-562483cf3a18-apiservice-cert\") pod \"elastic-operator-688b5775f4-bvrcz\" (UID: \"e6700f3e-4423-48fe-94ae-562483cf3a18\") " pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.845673 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gszqw\" (UniqueName: \"kubernetes.io/projected/e6700f3e-4423-48fe-94ae-562483cf3a18-kube-api-access-gszqw\") pod \"elastic-operator-688b5775f4-bvrcz\" (UID: \"e6700f3e-4423-48fe-94ae-562483cf3a18\") " pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.896211 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.261960 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-688b5775f4-bvrcz"] Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.537519 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.630367 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-bundle\") pod \"4bdb298d-0c46-429d-b4c2-44d106881eb7\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.630488 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw46p\" (UniqueName: \"kubernetes.io/projected/4bdb298d-0c46-429d-b4c2-44d106881eb7-kube-api-access-dw46p\") pod \"4bdb298d-0c46-429d-b4c2-44d106881eb7\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.630549 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-util\") pod \"4bdb298d-0c46-429d-b4c2-44d106881eb7\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.631745 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-bundle" (OuterVolumeSpecName: "bundle") pod "4bdb298d-0c46-429d-b4c2-44d106881eb7" (UID: "4bdb298d-0c46-429d-b4c2-44d106881eb7"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.638040 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bdb298d-0c46-429d-b4c2-44d106881eb7-kube-api-access-dw46p" (OuterVolumeSpecName: "kube-api-access-dw46p") pod "4bdb298d-0c46-429d-b4c2-44d106881eb7" (UID: "4bdb298d-0c46-429d-b4c2-44d106881eb7"). InnerVolumeSpecName "kube-api-access-dw46p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.645784 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-util" (OuterVolumeSpecName: "util") pod "4bdb298d-0c46-429d-b4c2-44d106881eb7" (UID: "4bdb298d-0c46-429d-b4c2-44d106881eb7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.732641 4774 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.732699 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw46p\" (UniqueName: \"kubernetes.io/projected/4bdb298d-0c46-429d-b4c2-44d106881eb7-kube-api-access-dw46p\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.732715 4774 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-util\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:01 crc kubenswrapper[4774]: I0127 00:20:01.116969 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" event={"ID":"e6700f3e-4423-48fe-94ae-562483cf3a18","Type":"ContainerStarted","Data":"494fd68295d72075193ae11fc25cf75b4b825a4ee078b8e85903808e0a46e289"} Jan 27 00:20:01 crc kubenswrapper[4774]: I0127 00:20:01.119771 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" event={"ID":"4bdb298d-0c46-429d-b4c2-44d106881eb7","Type":"ContainerDied","Data":"0b3a751df39c09e406c339e045a1a8e26cff9cffe0a00cfe8379e61eb23fe3ef"} Jan 27 00:20:01 crc kubenswrapper[4774]: I0127 00:20:01.119792 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b3a751df39c09e406c339e045a1a8e26cff9cffe0a00cfe8379e61eb23fe3ef" Jan 27 00:20:01 crc kubenswrapper[4774]: I0127 00:20:01.119895 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:20:02 crc kubenswrapper[4774]: I0127 00:20:02.018880 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:20:02 crc kubenswrapper[4774]: I0127 00:20:02.019212 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:20:02 crc kubenswrapper[4774]: I0127 00:20:02.074030 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:20:03 crc kubenswrapper[4774]: I0127 00:20:03.294038 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:20:03 crc kubenswrapper[4774]: I0127 00:20:03.351187 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:20:04 crc kubenswrapper[4774]: I0127 00:20:04.297724 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qdczk"] Jan 27 00:20:04 crc kubenswrapper[4774]: I0127 00:20:04.298387 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qdczk" podUID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerName="registry-server" containerID="cri-o://9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180" gracePeriod=2 Jan 27 00:20:05 crc kubenswrapper[4774]: I0127 00:20:05.155833 4774 generic.go:334] "Generic (PLEG): container finished" podID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerID="9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180" exitCode=0 Jan 27 00:20:05 crc kubenswrapper[4774]: I0127 00:20:05.155896 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdczk" event={"ID":"bae6d967-d19c-4ab9-a2e2-21292d93389f","Type":"ContainerDied","Data":"9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180"} Jan 27 00:20:06 crc kubenswrapper[4774]: I0127 00:20:06.675716 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:20:06 crc kubenswrapper[4774]: I0127 00:20:06.675776 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:20:06 crc kubenswrapper[4774]: I0127 00:20:06.675819 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:20:06 crc kubenswrapper[4774]: I0127 00:20:06.676417 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b1c2487de37d74b3854324adcbb324d646465194c27d05f07ed619de40219442"} pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" containerMessage="Container machine-config-daemon failed liveness probe, 
will be restarted" Jan 27 00:20:06 crc kubenswrapper[4774]: I0127 00:20:06.676479 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" containerID="cri-o://b1c2487de37d74b3854324adcbb324d646465194c27d05f07ed619de40219442" gracePeriod=600 Jan 27 00:20:07 crc kubenswrapper[4774]: I0127 00:20:07.170398 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerDied","Data":"b1c2487de37d74b3854324adcbb324d646465194c27d05f07ed619de40219442"} Jan 27 00:20:07 crc kubenswrapper[4774]: I0127 00:20:07.170480 4774 scope.go:117] "RemoveContainer" containerID="1835e32e3ad8e493de6e8f27a5dbcc7b2eb8f1908c27b56eb3a4006aa36c9b0d" Jan 27 00:20:07 crc kubenswrapper[4774]: I0127 00:20:07.170390 4774 generic.go:334] "Generic (PLEG): container finished" podID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerID="b1c2487de37d74b3854324adcbb324d646465194c27d05f07ed619de40219442" exitCode=0 Jan 27 00:20:08 crc kubenswrapper[4774]: I0127 00:20:08.682161 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pdxh5"] Jan 27 00:20:08 crc kubenswrapper[4774]: I0127 00:20:08.682741 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pdxh5" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" containerName="registry-server" containerID="cri-o://27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3" gracePeriod=2 Jan 27 00:20:08 crc kubenswrapper[4774]: E0127 00:20:08.910412 4774 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180 is running failed: container process not found" containerID="9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 00:20:08 crc kubenswrapper[4774]: E0127 00:20:08.911434 4774 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180 is running failed: container process not found" containerID="9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 00:20:08 crc kubenswrapper[4774]: E0127 00:20:08.912073 4774 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180 is running failed: container process not found" containerID="9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 00:20:08 crc kubenswrapper[4774]: E0127 00:20:08.912112 4774 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-qdczk" podUID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerName="registry-server" Jan 27 00:20:09 crc 
kubenswrapper[4774]: I0127 00:20:09.212662 4774 generic.go:334] "Generic (PLEG): container finished" podID="4c9c1faf-541d-4491-b37e-99909c74944b" containerID="27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3" exitCode=0 Jan 27 00:20:09 crc kubenswrapper[4774]: I0127 00:20:09.212699 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdxh5" event={"ID":"4c9c1faf-541d-4491-b37e-99909c74944b","Type":"ContainerDied","Data":"27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3"} Jan 27 00:20:12 crc kubenswrapper[4774]: I0127 00:20:12.116750 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:20:13 crc kubenswrapper[4774]: I0127 00:20:13.085154 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9wztc"] Jan 27 00:20:13 crc kubenswrapper[4774]: I0127 00:20:13.085438 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9wztc" podUID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerName="registry-server" containerID="cri-o://c5bf743ef777224a76b80bf6d3d237f0094e6821c6315c5a72dc6d8fb0479220" gracePeriod=2 Jan 27 00:20:13 crc kubenswrapper[4774]: E0127 00:20:13.221953 4774 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3 is running failed: container process not found" containerID="27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 00:20:13 crc kubenswrapper[4774]: E0127 00:20:13.222450 4774 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3 is running failed: container process not found" containerID="27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 00:20:13 crc kubenswrapper[4774]: E0127 00:20:13.222685 4774 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3 is running failed: container process not found" containerID="27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 00:20:13 crc kubenswrapper[4774]: E0127 00:20:13.222718 4774 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-pdxh5" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" containerName="registry-server" Jan 27 00:20:13 crc kubenswrapper[4774]: I0127 00:20:13.250190 4774 generic.go:334] "Generic (PLEG): container finished" podID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerID="c5bf743ef777224a76b80bf6d3d237f0094e6821c6315c5a72dc6d8fb0479220" exitCode=0 Jan 27 00:20:13 crc kubenswrapper[4774]: I0127 00:20:13.250259 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wztc" 
event={"ID":"4afde62c-1f8e-4d6f-87ab-b4710b6c7158","Type":"ContainerDied","Data":"c5bf743ef777224a76b80bf6d3d237f0094e6821c6315c5a72dc6d8fb0479220"} Jan 27 00:20:15 crc kubenswrapper[4774]: I0127 00:20:15.979244 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw"] Jan 27 00:20:15 crc kubenswrapper[4774]: E0127 00:20:15.980095 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bdb298d-0c46-429d-b4c2-44d106881eb7" containerName="util" Jan 27 00:20:15 crc kubenswrapper[4774]: I0127 00:20:15.980129 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bdb298d-0c46-429d-b4c2-44d106881eb7" containerName="util" Jan 27 00:20:15 crc kubenswrapper[4774]: E0127 00:20:15.980156 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bdb298d-0c46-429d-b4c2-44d106881eb7" containerName="extract" Jan 27 00:20:15 crc kubenswrapper[4774]: I0127 00:20:15.980162 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bdb298d-0c46-429d-b4c2-44d106881eb7" containerName="extract" Jan 27 00:20:15 crc kubenswrapper[4774]: E0127 00:20:15.980170 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bdb298d-0c46-429d-b4c2-44d106881eb7" containerName="pull" Jan 27 00:20:15 crc kubenswrapper[4774]: I0127 00:20:15.980177 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bdb298d-0c46-429d-b4c2-44d106881eb7" containerName="pull" Jan 27 00:20:15 crc kubenswrapper[4774]: I0127 00:20:15.980286 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bdb298d-0c46-429d-b4c2-44d106881eb7" containerName="extract" Jan 27 00:20:15 crc kubenswrapper[4774]: I0127 00:20:15.980727 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" Jan 27 00:20:15 crc kubenswrapper[4774]: I0127 00:20:15.982667 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Jan 27 00:20:15 crc kubenswrapper[4774]: I0127 00:20:15.983536 4774 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-fnw7h" Jan 27 00:20:15 crc kubenswrapper[4774]: I0127 00:20:15.983981 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Jan 27 00:20:16 crc kubenswrapper[4774]: I0127 00:20:16.008243 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw"] Jan 27 00:20:16 crc kubenswrapper[4774]: I0127 00:20:16.093471 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6de9ae8e-6adf-4a50-9482-0ca1d1691559-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-zx7bw\" (UID: \"6de9ae8e-6adf-4a50-9482-0ca1d1691559\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" Jan 27 00:20:16 crc kubenswrapper[4774]: I0127 00:20:16.093570 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8lfj\" (UniqueName: \"kubernetes.io/projected/6de9ae8e-6adf-4a50-9482-0ca1d1691559-kube-api-access-l8lfj\") pod \"cert-manager-operator-controller-manager-5446d6888b-zx7bw\" (UID: \"6de9ae8e-6adf-4a50-9482-0ca1d1691559\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" Jan 27 00:20:16 crc kubenswrapper[4774]: E0127 00:20:16.129126 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:2ecf763b02048d2cf4c17967a7b2cacc7afd6af0e963a39579d876f8f4170e3c" Jan 27 00:20:16 crc kubenswrapper[4774]: E0127 00:20:16.129440 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:2ecf763b02048d2cf4c17967a7b2cacc7afd6af0e963a39579d876f8f4170e3c,Command:[],Args:[--namespace=$(NAMESPACE) --images=perses=$(RELATED_IMAGE_PERSES) --images=alertmanager=$(RELATED_IMAGE_ALERTMANAGER) --images=prometheus=$(RELATED_IMAGE_PROMETHEUS) --images=thanos=$(RELATED_IMAGE_THANOS) --images=ui-dashboards=$(RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN) --images=ui-distributed-tracing=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN) --images=ui-distributed-tracing-pf5=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5) --images=ui-distributed-tracing-pf4=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4) --images=ui-logging=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN) --images=ui-logging-pf4=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4) --images=ui-troubleshooting-panel=$(RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN) --images=ui-monitoring=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN) --images=ui-monitoring-pf5=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5) --images=korrel8r=$(RELATED_IMAGE_KORREL8R) 
--images=health-analyzer=$(RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER) --openshift.enabled=true],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER,Value:registry.redhat.io/cluster-observability-operator/alertmanager-rhel9@sha256:dc62889b883f597de91b5389cc52c84c607247d49a807693be2f688e4703dfc3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS,Value:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:1b555e21bba7c609111ace4380382a696d9aceeb6e9816bf9023b8f689b6c741,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_THANOS,Value:registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:a223bab813b82d698992490bbb60927f6288a83ba52d539836c250e1471f6d34,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PERSES,Value:registry.redhat.io/cluster-observability-operator/perses-rhel9@sha256:e797cdb47beef40b04da7b6d645bca3dc32e6247003c45b56b38efd9e13bf01c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-rhel9@sha256:093d2731ac848ed5fd57356b155a19d3bf7b8db96d95b09c5d0095e143f7254f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-rhel9@sha256:7d662a120305e2528acc7e9142b770b5b6a7f4932ddfcadfa4ac953935124895,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf5-rhel9@sha256:75465aabb0aa427a5c531a8fcde463f6d119afbcc618ebcbf6b7ee9bc8aad160,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf4-rhel9@sha256:dc18c8d6a4a9a0a574a57cc5082c8a9b26023bd6d69b9732892d584c1dfe5070,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-rhel9@sha256:369729978cecdc13c99ef3d179f8eb8a450a4a0cb70b63c27a55a15d1710ba27,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-pf4-rhel9@sha256:d8c7a61d147f62b204d5c5f16864386025393453c9a81ea327bbd25d7765d611,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/troubleshooting-panel-console-plugin-rhel9@sha256:b4a6eb1cc118a4334b424614959d8b7f361ddd779b3a72690ca49b0a3f26d9b8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-rhel9@sha256:21d4fff670893ba4b7fbc528cd49f8b71c8281cede9ef84f0697065bb6a7fc50,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-pf5-rhel9@sha256:12d9dbe297a1c3b9df671f21156992082bc483887d851fafe76e5d17321ff474,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KORREL8R,Value:registry.redhat.io/cluster-observability-operator/korrel8r-rhel9@sha256:e65c37f04f6d76a0cbfe05edb3cddf6a8f14f859ee35cf3aebea8fcb991d2c19,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER,Value:registry.redhat.io/cluster-observability-opera
tor/cluster-health-analyzer-rhel9@sha256:48e4e178c6eeaa9d5dd77a591c185a311b4b4a5caadb7199d48463123e31dc9e,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{400 -3} {} 400m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:observability-operator-tls,ReadOnly:true,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2222c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod observability-operator-59bdc8b94-fqps4_openshift-operators(3247d37e-1277-411a-ad8b-ffcd6172206f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 00:20:16 crc kubenswrapper[4774]: E0127 00:20:16.130928 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/observability-operator-59bdc8b94-fqps4" podUID="3247d37e-1277-411a-ad8b-ffcd6172206f" Jan 27 00:20:16 crc kubenswrapper[4774]: I0127 00:20:16.195407 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6de9ae8e-6adf-4a50-9482-0ca1d1691559-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-zx7bw\" (UID: \"6de9ae8e-6adf-4a50-9482-0ca1d1691559\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" Jan 27 00:20:16 crc kubenswrapper[4774]: I0127 00:20:16.195497 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8lfj\" (UniqueName: \"kubernetes.io/projected/6de9ae8e-6adf-4a50-9482-0ca1d1691559-kube-api-access-l8lfj\") pod \"cert-manager-operator-controller-manager-5446d6888b-zx7bw\" (UID: \"6de9ae8e-6adf-4a50-9482-0ca1d1691559\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" Jan 27 00:20:16 crc kubenswrapper[4774]: I0127 00:20:16.196249 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6de9ae8e-6adf-4a50-9482-0ca1d1691559-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-zx7bw\" (UID: \"6de9ae8e-6adf-4a50-9482-0ca1d1691559\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" Jan 27 00:20:16 crc kubenswrapper[4774]: I0127 00:20:16.221037 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8lfj\" (UniqueName: \"kubernetes.io/projected/6de9ae8e-6adf-4a50-9482-0ca1d1691559-kube-api-access-l8lfj\") pod \"cert-manager-operator-controller-manager-5446d6888b-zx7bw\" (UID: \"6de9ae8e-6adf-4a50-9482-0ca1d1691559\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" Jan 27 00:20:16 crc kubenswrapper[4774]: E0127 00:20:16.278575 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:2ecf763b02048d2cf4c17967a7b2cacc7afd6af0e963a39579d876f8f4170e3c\\\"\"" pod="openshift-operators/observability-operator-59bdc8b94-fqps4" podUID="3247d37e-1277-411a-ad8b-ffcd6172206f" Jan 27 00:20:16 crc kubenswrapper[4774]: I0127 00:20:16.299639 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" Jan 27 00:20:16 crc kubenswrapper[4774]: E0127 00:20:16.612373 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43" Jan 27 00:20:16 crc kubenswrapper[4774]: E0127 00:20:16.612665 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:interconnect-operator,Image:registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43,Command:[qdr-operator],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:60000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:qdr-operator,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_QDROUTERD_IMAGE,Value:registry.redhat.io/amq7/amq-interconnect@sha256:31d87473fa684178a694f9ee331d3c80f2653f9533cb65c2a325752166a077e9,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:amq7-interconnect-operator.v1.10.20,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rzln8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod interconnect-operator-5bb49f789d-xcbvz_service-telemetry(f2a5d99b-17f5-4a46-958b-ae997b57245e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 00:20:16 crc kubenswrapper[4774]: E0127 00:20:16.617432 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"interconnect-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="service-telemetry/interconnect-operator-5bb49f789d-xcbvz" podUID="f2a5d99b-17f5-4a46-958b-ae997b57245e" Jan 27 00:20:17 crc kubenswrapper[4774]: E0127 00:20:17.289334 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"interconnect-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43\\\"\"" pod="service-telemetry/interconnect-operator-5bb49f789d-xcbvz" podUID="f2a5d99b-17f5-4a46-958b-ae997b57245e" Jan 27 00:20:17 crc kubenswrapper[4774]: E0127 00:20:17.585163 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e7e5f4c5e8ab0ba298ef0295a7137d438a42eb177d9322212cde6ba8f367912a" Jan 27 00:20:17 crc 
kubenswrapper[4774]: E0127 00:20:17.585446 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e7e5f4c5e8ab0ba298ef0295a7137d438a42eb177d9322212cde6ba8f367912a,Command:[],Args:[--prometheus-config-reloader=$(RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER) --prometheus-instance-selector=app.kubernetes.io/managed-by=observability-operator --alertmanager-instance-selector=app.kubernetes.io/managed-by=observability-operator --thanos-ruler-instance-selector=app.kubernetes.io/managed-by=observability-operator --watch-referenced-objects-in-all-namespaces=true --disable-unmanaged-prometheus-configuration=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOGC,Value:30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER,Value:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:9a2097bc5b2e02bc1703f64c452ce8fe4bc6775b732db930ff4770b76ae4653a,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{157286400 0} {} 150Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qqdl8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-68bc856cb9-qzvj5_openshift-operators(82ad6e88-a32b-4f4f-9a96-66d10c58a7d9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 00:20:17 crc kubenswrapper[4774]: E0127 00:20:17.587041 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5" podUID="82ad6e88-a32b-4f4f-9a96-66d10c58a7d9" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.616362 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.618157 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.717024 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8bxx\" (UniqueName: \"kubernetes.io/projected/bae6d967-d19c-4ab9-a2e2-21292d93389f-kube-api-access-d8bxx\") pod \"bae6d967-d19c-4ab9-a2e2-21292d93389f\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.717591 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-catalog-content\") pod \"4c9c1faf-541d-4491-b37e-99909c74944b\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.717631 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-utilities\") pod \"4c9c1faf-541d-4491-b37e-99909c74944b\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.717666 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-utilities\") pod \"bae6d967-d19c-4ab9-a2e2-21292d93389f\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.717696 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-catalog-content\") pod \"bae6d967-d19c-4ab9-a2e2-21292d93389f\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.717801 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfmdd\" (UniqueName: \"kubernetes.io/projected/4c9c1faf-541d-4491-b37e-99909c74944b-kube-api-access-tfmdd\") pod \"4c9c1faf-541d-4491-b37e-99909c74944b\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.719904 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-utilities" (OuterVolumeSpecName: "utilities") pod "4c9c1faf-541d-4491-b37e-99909c74944b" (UID: "4c9c1faf-541d-4491-b37e-99909c74944b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.720285 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-utilities" (OuterVolumeSpecName: "utilities") pod "bae6d967-d19c-4ab9-a2e2-21292d93389f" (UID: "bae6d967-d19c-4ab9-a2e2-21292d93389f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.736654 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bae6d967-d19c-4ab9-a2e2-21292d93389f-kube-api-access-d8bxx" (OuterVolumeSpecName: "kube-api-access-d8bxx") pod "bae6d967-d19c-4ab9-a2e2-21292d93389f" (UID: "bae6d967-d19c-4ab9-a2e2-21292d93389f"). InnerVolumeSpecName "kube-api-access-d8bxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.738260 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c9c1faf-541d-4491-b37e-99909c74944b-kube-api-access-tfmdd" (OuterVolumeSpecName: "kube-api-access-tfmdd") pod "4c9c1faf-541d-4491-b37e-99909c74944b" (UID: "4c9c1faf-541d-4491-b37e-99909c74944b"). InnerVolumeSpecName "kube-api-access-tfmdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.785284 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bae6d967-d19c-4ab9-a2e2-21292d93389f" (UID: "bae6d967-d19c-4ab9-a2e2-21292d93389f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.819746 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfmdd\" (UniqueName: \"kubernetes.io/projected/4c9c1faf-541d-4491-b37e-99909c74944b-kube-api-access-tfmdd\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.820157 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8bxx\" (UniqueName: \"kubernetes.io/projected/bae6d967-d19c-4ab9-a2e2-21292d93389f-kube-api-access-d8bxx\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.820211 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.820262 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.820328 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.970515 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c9c1faf-541d-4491-b37e-99909c74944b" (UID: "4c9c1faf-541d-4491-b37e-99909c74944b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.022878 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.066136 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.099698 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw"] Jan 27 00:20:18 crc kubenswrapper[4774]: W0127 00:20:18.112540 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6de9ae8e_6adf_4a50_9482_0ca1d1691559.slice/crio-9954491a314d6210216d0fa6ed6038755753f29c88b7200855f539fcc46d4c54 WatchSource:0}: Error finding container 9954491a314d6210216d0fa6ed6038755753f29c88b7200855f539fcc46d4c54: Status 404 returned error can't find the container with id 9954491a314d6210216d0fa6ed6038755753f29c88b7200855f539fcc46d4c54 Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.123534 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vs8t\" (UniqueName: \"kubernetes.io/projected/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-kube-api-access-8vs8t\") pod \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.123760 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-utilities\") pod \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.123787 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-catalog-content\") pod \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.128182 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-utilities" (OuterVolumeSpecName: "utilities") pod "4afde62c-1f8e-4d6f-87ab-b4710b6c7158" (UID: "4afde62c-1f8e-4d6f-87ab-b4710b6c7158"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.136900 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-kube-api-access-8vs8t" (OuterVolumeSpecName: "kube-api-access-8vs8t") pod "4afde62c-1f8e-4d6f-87ab-b4710b6c7158" (UID: "4afde62c-1f8e-4d6f-87ab-b4710b6c7158"). InnerVolumeSpecName "kube-api-access-8vs8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.219724 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4afde62c-1f8e-4d6f-87ab-b4710b6c7158" (UID: "4afde62c-1f8e-4d6f-87ab-b4710b6c7158"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.225683 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vs8t\" (UniqueName: \"kubernetes.io/projected/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-kube-api-access-8vs8t\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.225902 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.225961 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.299417 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wztc" event={"ID":"4afde62c-1f8e-4d6f-87ab-b4710b6c7158","Type":"ContainerDied","Data":"f277365d77be95b43a841265074b0061aae099b70be4de813c5361a232b01168"} Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.299551 4774 scope.go:117] "RemoveContainer" containerID="c5bf743ef777224a76b80bf6d3d237f0094e6821c6315c5a72dc6d8fb0479220" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.299460 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.302597 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" event={"ID":"96fc7bad-ed57-4110-afa8-9a6e5748c292","Type":"ContainerStarted","Data":"3ab50ec43d30b6cacbebffd3612e5698f2534b69b0e20c1acd6ba5b645b76e20"} Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.303530 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.310641 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdczk" event={"ID":"bae6d967-d19c-4ab9-a2e2-21292d93389f","Type":"ContainerDied","Data":"c7f9a533937e391d5a101f3230a6fdf47cd308c1e2082bc57db248e561eaab18"} Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.310781 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.317573 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" event={"ID":"6de9ae8e-6adf-4a50-9482-0ca1d1691559","Type":"ContainerStarted","Data":"9954491a314d6210216d0fa6ed6038755753f29c88b7200855f539fcc46d4c54"} Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.326991 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" podStartSLOduration=3.146570146 podStartE2EDuration="22.326964242s" podCreationTimestamp="2026-01-27 00:19:56 +0000 UTC" firstStartedPulling="2026-01-27 00:19:58.462610948 +0000 UTC m=+776.768387832" lastFinishedPulling="2026-01-27 00:20:17.643005044 +0000 UTC m=+795.948781928" observedRunningTime="2026-01-27 00:20:18.323769646 +0000 UTC m=+796.629546530" watchObservedRunningTime="2026-01-27 00:20:18.326964242 +0000 UTC m=+796.632741126" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.329699 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdxh5" event={"ID":"4c9c1faf-541d-4491-b37e-99909c74944b","Type":"ContainerDied","Data":"23a340b31e207dc1319a830ec2ed78f64feba461f1f191f536b89271a9b624f9"} Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.329817 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.337585 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" event={"ID":"4ac18775-726a-43da-a184-dfd1565544f1","Type":"ContainerStarted","Data":"bee7ecc2ff4418719abb4e0308ad2db20bcdc6e3438cdb67c4b1f924f2d71375"} Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.340676 4774 scope.go:117] "RemoveContainer" containerID="6adb30d15bae1199f3712c4a8352afcec6a4c2a58544b7c28ed4b9cf80a6f3bb" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.343174 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" event={"ID":"dfee63d3-9a5d-46f6-b984-78d6a837e20c","Type":"ContainerStarted","Data":"4600c5cb3c4f9b4561588557caae5830a0960b54703cbf258ac4f048be39f6b6"} Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.347910 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"9eedfdb07f2482a1f5cd24c38dd8c573ca8994af4d572b2eee7e79784fd3eb29"} Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.350838 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" event={"ID":"e6700f3e-4423-48fe-94ae-562483cf3a18","Type":"ContainerStarted","Data":"b810bc6a18801ba49e5d9a6f3beaae9060508746bf13d60cff6840025efc02f3"} Jan 27 00:20:18 crc kubenswrapper[4774]: E0127 00:20:18.351793 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e7e5f4c5e8ab0ba298ef0295a7137d438a42eb177d9322212cde6ba8f367912a\\\"\"" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5" podUID="82ad6e88-a32b-4f4f-9a96-66d10c58a7d9" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.359944 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9wztc"] Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.377829 4774 scope.go:117] "RemoveContainer" containerID="e02d219daaca4ec67fc454d718eec7db071fdd5266115161621d49cc1c997588" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.384777 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9wztc"] Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.425646 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" podStartSLOduration=2.233365354 podStartE2EDuration="23.425628506s" podCreationTimestamp="2026-01-27 00:19:55 +0000 UTC" firstStartedPulling="2026-01-27 00:19:56.45067862 +0000 UTC m=+774.756455504" lastFinishedPulling="2026-01-27 00:20:17.642941772 +0000 UTC m=+795.948718656" observedRunningTime="2026-01-27 00:20:18.394387052 +0000 UTC m=+796.700163946" watchObservedRunningTime="2026-01-27 00:20:18.425628506 +0000 UTC m=+796.731405390" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.432337 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pdxh5"] Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.433473 4774 scope.go:117] "RemoveContainer" containerID="9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.436719 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pdxh5"] Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.476580 4774 scope.go:117] "RemoveContainer" containerID="200e6d7e4a8ae08ab9f21528d73e613a734eef3b1ea5439197e326a0aac07a73" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.511747 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" podStartSLOduration=2.584897758 podStartE2EDuration="23.511724664s" podCreationTimestamp="2026-01-27 00:19:55 +0000 UTC" firstStartedPulling="2026-01-27 00:19:56.63085072 +0000 UTC m=+774.936627604" lastFinishedPulling="2026-01-27 00:20:17.557677606 +0000 UTC m=+795.863454510" observedRunningTime="2026-01-27 00:20:18.510555703 +0000 UTC m=+796.816332587" watchObservedRunningTime="2026-01-27 00:20:18.511724664 +0000 UTC m=+796.817501558" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.531807 4774 scope.go:117] "RemoveContainer" containerID="4c26c2e2994e8be6a1fd5545240fd4ba414fbf6378b30c5235cd400317e57ad6" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.607344 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" podStartSLOduration=2.255509023 podStartE2EDuration="19.607321986s" podCreationTimestamp="2026-01-27 00:19:59 +0000 UTC" firstStartedPulling="2026-01-27 00:20:00.281703058 +0000 UTC m=+778.587479942" lastFinishedPulling="2026-01-27 00:20:17.633516021 +0000 UTC m=+795.939292905" observedRunningTime="2026-01-27 00:20:18.60673339 +0000 UTC m=+796.912510284" watchObservedRunningTime="2026-01-27 00:20:18.607321986 +0000 UTC m=+796.913098880" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.629502 4774 
scope.go:117] "RemoveContainer" containerID="27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.671153 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qdczk"] Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.675020 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qdczk"] Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.692987 4774 scope.go:117] "RemoveContainer" containerID="92dbedab7e3a765202cdab5c278b77c503ff9fdad67f1ad130701e4b0d133e1e" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.720034 4774 scope.go:117] "RemoveContainer" containerID="44bf8961053363986c01a6554172bfd8b05751670ed90f16bf1e2f86b2ba58dc" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.436449 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 27 00:20:19 crc kubenswrapper[4774]: E0127 00:20:19.438115 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" containerName="extract-utilities" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.438182 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" containerName="extract-utilities" Jan 27 00:20:19 crc kubenswrapper[4774]: E0127 00:20:19.438239 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" containerName="extract-content" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.438285 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" containerName="extract-content" Jan 27 00:20:19 crc kubenswrapper[4774]: E0127 00:20:19.438381 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerName="registry-server" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.438434 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerName="registry-server" Jan 27 00:20:19 crc kubenswrapper[4774]: E0127 00:20:19.438483 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerName="extract-content" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.438529 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerName="extract-content" Jan 27 00:20:19 crc kubenswrapper[4774]: E0127 00:20:19.438615 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" containerName="registry-server" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.438672 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" containerName="registry-server" Jan 27 00:20:19 crc kubenswrapper[4774]: E0127 00:20:19.438764 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerName="extract-content" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.438934 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerName="extract-content" Jan 27 00:20:19 crc kubenswrapper[4774]: E0127 00:20:19.439005 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae6d967-d19c-4ab9-a2e2-21292d93389f" 
containerName="extract-utilities" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.439052 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerName="extract-utilities" Jan 27 00:20:19 crc kubenswrapper[4774]: E0127 00:20:19.439127 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerName="registry-server" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.439178 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerName="registry-server" Jan 27 00:20:19 crc kubenswrapper[4774]: E0127 00:20:19.439272 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerName="extract-utilities" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.439324 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerName="extract-utilities" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.439516 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerName="registry-server" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.439581 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" containerName="registry-server" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.439629 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerName="registry-server" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.441653 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.444214 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.444413 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.444437 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.444653 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.444741 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.444824 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.444898 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.445223 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-z65l4" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.445523 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.453425 
4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.579214 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.579769 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.579796 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.579821 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/b4f320b3-4c9f-433a-943a-8f2934061b87-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.579846 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.580060 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.580181 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.580234 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " 
pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.580376 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.580429 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.580527 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.580560 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.580627 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.580708 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.580733 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682383 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: 
\"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682440 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682472 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682495 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682520 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682549 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682571 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682592 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682615 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/b4f320b3-4c9f-433a-943a-8f2934061b87-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682637 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682657 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682678 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682698 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682725 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682745 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.684090 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.684120 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.684131 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: 
\"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.684474 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.684552 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.684880 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.685353 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.685677 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.700874 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.703439 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.706918 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 
00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.708193 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.718496 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.722783 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.740461 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/b4f320b3-4c9f-433a-943a-8f2934061b87-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.762214 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:20 crc kubenswrapper[4774]: I0127 00:20:20.219981 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 27 00:20:20 crc kubenswrapper[4774]: I0127 00:20:20.364352 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" path="/var/lib/kubelet/pods/4afde62c-1f8e-4d6f-87ab-b4710b6c7158/volumes" Jan 27 00:20:20 crc kubenswrapper[4774]: I0127 00:20:20.365384 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" path="/var/lib/kubelet/pods/4c9c1faf-541d-4491-b37e-99909c74944b/volumes" Jan 27 00:20:20 crc kubenswrapper[4774]: I0127 00:20:20.366160 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bae6d967-d19c-4ab9-a2e2-21292d93389f" path="/var/lib/kubelet/pods/bae6d967-d19c-4ab9-a2e2-21292d93389f/volumes" Jan 27 00:20:20 crc kubenswrapper[4774]: I0127 00:20:20.392353 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"b4f320b3-4c9f-433a-943a-8f2934061b87","Type":"ContainerStarted","Data":"3f5de3ffe8938d560ac5c34306502ab0706452284e8018d6fc6f6c64b1f944d9"} Jan 27 00:20:26 crc kubenswrapper[4774]: I0127 00:20:26.481064 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" Jan 27 00:20:32 crc kubenswrapper[4774]: I0127 00:20:32.492765 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-xcbvz" event={"ID":"f2a5d99b-17f5-4a46-958b-ae997b57245e","Type":"ContainerStarted","Data":"55e9c58d67bcf7565f4e2a76c9849b0567ed678bfcabf2a5096249212063644e"} Jan 27 00:20:32 crc kubenswrapper[4774]: I0127 00:20:32.495851 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" event={"ID":"6de9ae8e-6adf-4a50-9482-0ca1d1691559","Type":"ContainerStarted","Data":"30176e87a0c7738e220da2d37b7d3b9a6b192f87f7b55eca9cac835e7b2d763e"} Jan 27 00:20:32 crc kubenswrapper[4774]: I0127 00:20:32.523207 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-xcbvz" podStartSLOduration=2.9097625049999998 podStartE2EDuration="36.52318945s" podCreationTimestamp="2026-01-27 00:19:56 +0000 UTC" firstStartedPulling="2026-01-27 00:19:58.690528242 +0000 UTC m=+776.996305126" lastFinishedPulling="2026-01-27 00:20:32.303955187 +0000 UTC m=+810.609732071" observedRunningTime="2026-01-27 00:20:32.519573463 +0000 UTC m=+810.825350357" watchObservedRunningTime="2026-01-27 00:20:32.52318945 +0000 UTC m=+810.828966334" Jan 27 00:20:32 crc kubenswrapper[4774]: I0127 00:20:32.553587 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" podStartSLOduration=7.113680386 podStartE2EDuration="17.553572941s" podCreationTimestamp="2026-01-27 00:20:15 +0000 UTC" firstStartedPulling="2026-01-27 00:20:18.118286312 +0000 UTC m=+796.424063196" lastFinishedPulling="2026-01-27 00:20:28.558178877 +0000 UTC m=+806.863955751" observedRunningTime="2026-01-27 00:20:32.547993873 +0000 UTC m=+810.853770767" watchObservedRunningTime="2026-01-27 00:20:32.553572941 +0000 UTC m=+810.859349825" Jan 27 00:20:35 crc kubenswrapper[4774]: 
I0127 00:20:35.401073 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-gg42r"] Jan 27 00:20:35 crc kubenswrapper[4774]: I0127 00:20:35.402478 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" Jan 27 00:20:35 crc kubenswrapper[4774]: I0127 00:20:35.406241 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 27 00:20:35 crc kubenswrapper[4774]: I0127 00:20:35.408608 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 27 00:20:35 crc kubenswrapper[4774]: I0127 00:20:35.421033 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-gg42r"] Jan 27 00:20:35 crc kubenswrapper[4774]: I0127 00:20:35.574988 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz2jb\" (UniqueName: \"kubernetes.io/projected/32c71eef-359e-42d1-a9c8-d0f392ecabaa-kube-api-access-xz2jb\") pod \"cert-manager-webhook-f4fb5df64-gg42r\" (UID: \"32c71eef-359e-42d1-a9c8-d0f392ecabaa\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" Jan 27 00:20:35 crc kubenswrapper[4774]: I0127 00:20:35.575101 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32c71eef-359e-42d1-a9c8-d0f392ecabaa-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-gg42r\" (UID: \"32c71eef-359e-42d1-a9c8-d0f392ecabaa\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" Jan 27 00:20:35 crc kubenswrapper[4774]: I0127 00:20:35.677444 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz2jb\" (UniqueName: \"kubernetes.io/projected/32c71eef-359e-42d1-a9c8-d0f392ecabaa-kube-api-access-xz2jb\") pod \"cert-manager-webhook-f4fb5df64-gg42r\" (UID: \"32c71eef-359e-42d1-a9c8-d0f392ecabaa\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" Jan 27 00:20:35 crc kubenswrapper[4774]: I0127 00:20:35.677581 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32c71eef-359e-42d1-a9c8-d0f392ecabaa-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-gg42r\" (UID: \"32c71eef-359e-42d1-a9c8-d0f392ecabaa\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" Jan 27 00:20:35 crc kubenswrapper[4774]: I0127 00:20:35.719738 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32c71eef-359e-42d1-a9c8-d0f392ecabaa-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-gg42r\" (UID: \"32c71eef-359e-42d1-a9c8-d0f392ecabaa\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" Jan 27 00:20:35 crc kubenswrapper[4774]: I0127 00:20:35.720469 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz2jb\" (UniqueName: \"kubernetes.io/projected/32c71eef-359e-42d1-a9c8-d0f392ecabaa-kube-api-access-xz2jb\") pod \"cert-manager-webhook-f4fb5df64-gg42r\" (UID: \"32c71eef-359e-42d1-a9c8-d0f392ecabaa\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" Jan 27 00:20:36 crc kubenswrapper[4774]: I0127 00:20:36.018390 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" Jan 27 00:20:36 crc kubenswrapper[4774]: I0127 00:20:36.295278 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-gg42r"] Jan 27 00:20:36 crc kubenswrapper[4774]: I0127 00:20:36.522307 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" event={"ID":"32c71eef-359e-42d1-a9c8-d0f392ecabaa","Type":"ContainerStarted","Data":"3f0a8ca9eaa5f3e751d24842cda55b2d88f42fca268c52b199efd3ddd35b5457"} Jan 27 00:20:38 crc kubenswrapper[4774]: I0127 00:20:38.535990 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"b4f320b3-4c9f-433a-943a-8f2934061b87","Type":"ContainerStarted","Data":"e1c28ef26536e4b801203ccfc64b75e994e7aa6d50dc91a3221cd5034177d536"} Jan 27 00:20:38 crc kubenswrapper[4774]: I0127 00:20:38.537325 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5" event={"ID":"82ad6e88-a32b-4f4f-9a96-66d10c58a7d9","Type":"ContainerStarted","Data":"a571534328c7c3da2d9655ac739f8026e17b47f3e16424be24dbd388ac1b7422"} Jan 27 00:20:39 crc kubenswrapper[4774]: I0127 00:20:39.562006 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-fqps4" event={"ID":"3247d37e-1277-411a-ad8b-ffcd6172206f","Type":"ContainerStarted","Data":"e6d6567f12a4bc63a0ce4a43c029910c706c61db654ed91c528debe4b527127e"} Jan 27 00:20:39 crc kubenswrapper[4774]: I0127 00:20:39.562909 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-fqps4" Jan 27 00:20:39 crc kubenswrapper[4774]: I0127 00:20:39.566225 4774 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-fqps4 container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.48:8081/healthz\": dial tcp 10.217.0.48:8081: connect: connection refused" start-of-body= Jan 27 00:20:39 crc kubenswrapper[4774]: I0127 00:20:39.566308 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-fqps4" podUID="3247d37e-1277-411a-ad8b-ffcd6172206f" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.48:8081/healthz\": dial tcp 10.217.0.48:8081: connect: connection refused" Jan 27 00:20:39 crc kubenswrapper[4774]: I0127 00:20:39.619443 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5" podStartSLOduration=8.925443095 podStartE2EDuration="44.619422039s" podCreationTimestamp="2026-01-27 00:19:55 +0000 UTC" firstStartedPulling="2026-01-27 00:19:56.451029471 +0000 UTC m=+774.756806355" lastFinishedPulling="2026-01-27 00:20:32.145008425 +0000 UTC m=+810.450785299" observedRunningTime="2026-01-27 00:20:39.615103163 +0000 UTC m=+817.920880057" watchObservedRunningTime="2026-01-27 00:20:39.619422039 +0000 UTC m=+817.925198923" Jan 27 00:20:39 crc kubenswrapper[4774]: I0127 00:20:39.637692 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-fqps4" podStartSLOduration=2.128552186 podStartE2EDuration="44.637665686s" podCreationTimestamp="2026-01-27 00:19:55 +0000 UTC" firstStartedPulling="2026-01-27 00:19:56.631181149 +0000 UTC m=+774.936958023" 
lastFinishedPulling="2026-01-27 00:20:39.140294639 +0000 UTC m=+817.446071523" observedRunningTime="2026-01-27 00:20:39.63261657 +0000 UTC m=+817.938393484" watchObservedRunningTime="2026-01-27 00:20:39.637665686 +0000 UTC m=+817.943442580" Jan 27 00:20:39 crc kubenswrapper[4774]: I0127 00:20:39.792945 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 27 00:20:39 crc kubenswrapper[4774]: I0127 00:20:39.835627 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 27 00:20:40 crc kubenswrapper[4774]: I0127 00:20:40.570015 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-fqps4" Jan 27 00:20:41 crc kubenswrapper[4774]: I0127 00:20:41.582342 4774 generic.go:334] "Generic (PLEG): container finished" podID="b4f320b3-4c9f-433a-943a-8f2934061b87" containerID="e1c28ef26536e4b801203ccfc64b75e994e7aa6d50dc91a3221cd5034177d536" exitCode=0 Jan 27 00:20:41 crc kubenswrapper[4774]: I0127 00:20:41.582437 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"b4f320b3-4c9f-433a-943a-8f2934061b87","Type":"ContainerDied","Data":"e1c28ef26536e4b801203ccfc64b75e994e7aa6d50dc91a3221cd5034177d536"} Jan 27 00:20:42 crc kubenswrapper[4774]: I0127 00:20:42.598848 4774 generic.go:334] "Generic (PLEG): container finished" podID="b4f320b3-4c9f-433a-943a-8f2934061b87" containerID="2cdca33b20fa2d3239d49076e22a71d3376abb0212bea8ffe215758cdb461050" exitCode=0 Jan 27 00:20:42 crc kubenswrapper[4774]: I0127 00:20:42.599027 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"b4f320b3-4c9f-433a-943a-8f2934061b87","Type":"ContainerDied","Data":"2cdca33b20fa2d3239d49076e22a71d3376abb0212bea8ffe215758cdb461050"} Jan 27 00:20:44 crc kubenswrapper[4774]: I0127 00:20:44.623098 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6"] Jan 27 00:20:44 crc kubenswrapper[4774]: I0127 00:20:44.624306 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" Jan 27 00:20:44 crc kubenswrapper[4774]: I0127 00:20:44.626395 4774 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-knclw" Jan 27 00:20:44 crc kubenswrapper[4774]: I0127 00:20:44.640695 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6"] Jan 27 00:20:44 crc kubenswrapper[4774]: I0127 00:20:44.809176 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84e34f19-48c2-4096-83e7-95b9fd16a1e6-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-hp7q6\" (UID: \"84e34f19-48c2-4096-83e7-95b9fd16a1e6\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" Jan 27 00:20:44 crc kubenswrapper[4774]: I0127 00:20:44.809668 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flxf9\" (UniqueName: \"kubernetes.io/projected/84e34f19-48c2-4096-83e7-95b9fd16a1e6-kube-api-access-flxf9\") pod \"cert-manager-cainjector-855d9ccff4-hp7q6\" (UID: \"84e34f19-48c2-4096-83e7-95b9fd16a1e6\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" Jan 27 00:20:44 crc kubenswrapper[4774]: I0127 00:20:44.910835 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84e34f19-48c2-4096-83e7-95b9fd16a1e6-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-hp7q6\" (UID: \"84e34f19-48c2-4096-83e7-95b9fd16a1e6\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" Jan 27 00:20:44 crc kubenswrapper[4774]: I0127 00:20:44.910909 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flxf9\" (UniqueName: \"kubernetes.io/projected/84e34f19-48c2-4096-83e7-95b9fd16a1e6-kube-api-access-flxf9\") pod \"cert-manager-cainjector-855d9ccff4-hp7q6\" (UID: \"84e34f19-48c2-4096-83e7-95b9fd16a1e6\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" Jan 27 00:20:44 crc kubenswrapper[4774]: I0127 00:20:44.939907 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flxf9\" (UniqueName: \"kubernetes.io/projected/84e34f19-48c2-4096-83e7-95b9fd16a1e6-kube-api-access-flxf9\") pod \"cert-manager-cainjector-855d9ccff4-hp7q6\" (UID: \"84e34f19-48c2-4096-83e7-95b9fd16a1e6\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" Jan 27 00:20:44 crc kubenswrapper[4774]: I0127 00:20:44.955434 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84e34f19-48c2-4096-83e7-95b9fd16a1e6-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-hp7q6\" (UID: \"84e34f19-48c2-4096-83e7-95b9fd16a1e6\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" Jan 27 00:20:45 crc kubenswrapper[4774]: I0127 00:20:45.239540 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" Jan 27 00:20:48 crc kubenswrapper[4774]: I0127 00:20:48.385216 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6"] Jan 27 00:20:48 crc kubenswrapper[4774]: I0127 00:20:48.657574 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"b4f320b3-4c9f-433a-943a-8f2934061b87","Type":"ContainerStarted","Data":"e9c15ad3203348e76f2c44f83792b6fbe61c8d02f239c5791fb1ac2a48d30915"} Jan 27 00:20:48 crc kubenswrapper[4774]: I0127 00:20:48.658203 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:48 crc kubenswrapper[4774]: I0127 00:20:48.660538 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" event={"ID":"32c71eef-359e-42d1-a9c8-d0f392ecabaa","Type":"ContainerStarted","Data":"a591c674af2565d2a8fcf2c7062d7c4e51e5dffac8cf0cf285e0834a15dbe116"} Jan 27 00:20:48 crc kubenswrapper[4774]: I0127 00:20:48.660623 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" Jan 27 00:20:48 crc kubenswrapper[4774]: I0127 00:20:48.662314 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" event={"ID":"84e34f19-48c2-4096-83e7-95b9fd16a1e6","Type":"ContainerStarted","Data":"f17ccc67d5046a300871dda616003a877a1195a908d7a53ae8d2007bf5cf3931"} Jan 27 00:20:48 crc kubenswrapper[4774]: I0127 00:20:48.662414 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" event={"ID":"84e34f19-48c2-4096-83e7-95b9fd16a1e6","Type":"ContainerStarted","Data":"d8c598429154924b660f5b84216ce367590333a4f519c51d7b7036a7d3777657"} Jan 27 00:20:48 crc kubenswrapper[4774]: I0127 00:20:48.699470 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=17.524848111 podStartE2EDuration="29.699448283s" podCreationTimestamp="2026-01-27 00:20:19 +0000 UTC" firstStartedPulling="2026-01-27 00:20:20.235739005 +0000 UTC m=+798.541515889" lastFinishedPulling="2026-01-27 00:20:32.410339177 +0000 UTC m=+810.716116061" observedRunningTime="2026-01-27 00:20:48.698768535 +0000 UTC m=+827.004545419" watchObservedRunningTime="2026-01-27 00:20:48.699448283 +0000 UTC m=+827.005225187" Jan 27 00:20:48 crc kubenswrapper[4774]: I0127 00:20:48.713549 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" podStartSLOduration=4.713522429 podStartE2EDuration="4.713522429s" podCreationTimestamp="2026-01-27 00:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:20:48.711841654 +0000 UTC m=+827.017618538" watchObservedRunningTime="2026-01-27 00:20:48.713522429 +0000 UTC m=+827.019299313" Jan 27 00:20:48 crc kubenswrapper[4774]: I0127 00:20:48.742521 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" podStartSLOduration=1.8219873930000001 podStartE2EDuration="13.742497433s" podCreationTimestamp="2026-01-27 00:20:35 +0000 UTC" firstStartedPulling="2026-01-27 00:20:36.307744416 +0000 UTC m=+814.613521300" 
lastFinishedPulling="2026-01-27 00:20:48.228254456 +0000 UTC m=+826.534031340" observedRunningTime="2026-01-27 00:20:48.737109909 +0000 UTC m=+827.042886803" watchObservedRunningTime="2026-01-27 00:20:48.742497433 +0000 UTC m=+827.048274317" Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.268565 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-9mmc4"] Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.269362 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-9mmc4" Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.271693 4774 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-pnb79" Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.295933 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb98j\" (UniqueName: \"kubernetes.io/projected/0a5e3cb3-d23f-405b-bb88-18159ee24067-kube-api-access-sb98j\") pod \"cert-manager-86cb77c54b-9mmc4\" (UID: \"0a5e3cb3-d23f-405b-bb88-18159ee24067\") " pod="cert-manager/cert-manager-86cb77c54b-9mmc4" Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.295996 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a5e3cb3-d23f-405b-bb88-18159ee24067-bound-sa-token\") pod \"cert-manager-86cb77c54b-9mmc4\" (UID: \"0a5e3cb3-d23f-405b-bb88-18159ee24067\") " pod="cert-manager/cert-manager-86cb77c54b-9mmc4" Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.296136 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-9mmc4"] Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.397801 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb98j\" (UniqueName: \"kubernetes.io/projected/0a5e3cb3-d23f-405b-bb88-18159ee24067-kube-api-access-sb98j\") pod \"cert-manager-86cb77c54b-9mmc4\" (UID: \"0a5e3cb3-d23f-405b-bb88-18159ee24067\") " pod="cert-manager/cert-manager-86cb77c54b-9mmc4" Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.397884 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a5e3cb3-d23f-405b-bb88-18159ee24067-bound-sa-token\") pod \"cert-manager-86cb77c54b-9mmc4\" (UID: \"0a5e3cb3-d23f-405b-bb88-18159ee24067\") " pod="cert-manager/cert-manager-86cb77c54b-9mmc4" Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.439714 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a5e3cb3-d23f-405b-bb88-18159ee24067-bound-sa-token\") pod \"cert-manager-86cb77c54b-9mmc4\" (UID: \"0a5e3cb3-d23f-405b-bb88-18159ee24067\") " pod="cert-manager/cert-manager-86cb77c54b-9mmc4" Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.441194 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb98j\" (UniqueName: \"kubernetes.io/projected/0a5e3cb3-d23f-405b-bb88-18159ee24067-kube-api-access-sb98j\") pod \"cert-manager-86cb77c54b-9mmc4\" (UID: \"0a5e3cb3-d23f-405b-bb88-18159ee24067\") " pod="cert-manager/cert-manager-86cb77c54b-9mmc4" Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.585979 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-9mmc4" Jan 27 00:20:50 crc kubenswrapper[4774]: I0127 00:20:50.061971 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-9mmc4"] Jan 27 00:20:50 crc kubenswrapper[4774]: W0127 00:20:50.067667 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a5e3cb3_d23f_405b_bb88_18159ee24067.slice/crio-0e6dc7a8d6f8a5e9d781f0edf8c2af1ebab705bfc12e7c81a1d525f71b3bb6e9 WatchSource:0}: Error finding container 0e6dc7a8d6f8a5e9d781f0edf8c2af1ebab705bfc12e7c81a1d525f71b3bb6e9: Status 404 returned error can't find the container with id 0e6dc7a8d6f8a5e9d781f0edf8c2af1ebab705bfc12e7c81a1d525f71b3bb6e9 Jan 27 00:20:50 crc kubenswrapper[4774]: I0127 00:20:50.689576 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-9mmc4" event={"ID":"0a5e3cb3-d23f-405b-bb88-18159ee24067","Type":"ContainerStarted","Data":"f6f2826e7ba0d1ce7cd4913ae053b4ee300ca2c5cd151a0ad737743c6a76ef60"} Jan 27 00:20:50 crc kubenswrapper[4774]: I0127 00:20:50.689655 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-9mmc4" event={"ID":"0a5e3cb3-d23f-405b-bb88-18159ee24067","Type":"ContainerStarted","Data":"0e6dc7a8d6f8a5e9d781f0edf8c2af1ebab705bfc12e7c81a1d525f71b3bb6e9"} Jan 27 00:20:50 crc kubenswrapper[4774]: I0127 00:20:50.713774 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-9mmc4" podStartSLOduration=1.713753404 podStartE2EDuration="1.713753404s" podCreationTimestamp="2026-01-27 00:20:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:20:50.706347205 +0000 UTC m=+829.012124089" watchObservedRunningTime="2026-01-27 00:20:50.713753404 +0000 UTC m=+829.019530288" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.495328 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.496917 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.502014 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.502205 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.502014 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.503055 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.540295 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.540369 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.540420 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.540471 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.540513 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.540574 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.540628 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.542588 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.542797 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.542887 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx9bg\" (UniqueName: \"kubernetes.io/projected/357826c8-1161-4d5b-9f9b-f1b432b59bb4-kube-api-access-wx9bg\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.542916 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.542947 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.545415 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644721 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx9bg\" (UniqueName: \"kubernetes.io/projected/357826c8-1161-4d5b-9f9b-f1b432b59bb4-kube-api-access-wx9bg\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644768 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644789 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644816 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644833 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644853 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644887 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644906 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644925 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644947 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644979 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.645017 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.645290 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.645364 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.645641 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.646056 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.646120 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.646248 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.646336 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 
00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.646442 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.646752 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.650549 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.651698 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.662167 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx9bg\" (UniqueName: \"kubernetes.io/projected/357826c8-1161-4d5b-9f9b-f1b432b59bb4-kube-api-access-wx9bg\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.812347 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:52 crc kubenswrapper[4774]: I0127 00:20:52.077657 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 27 00:20:52 crc kubenswrapper[4774]: W0127 00:20:52.103156 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod357826c8_1161_4d5b_9f9b_f1b432b59bb4.slice/crio-f668e73eccb9ae027260398cd4c656b2c1c4645de8d3889a343b6e8e287f7bac WatchSource:0}: Error finding container f668e73eccb9ae027260398cd4c656b2c1c4645de8d3889a343b6e8e287f7bac: Status 404 returned error can't find the container with id f668e73eccb9ae027260398cd4c656b2c1c4645de8d3889a343b6e8e287f7bac Jan 27 00:20:52 crc kubenswrapper[4774]: I0127 00:20:52.707445 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"357826c8-1161-4d5b-9f9b-f1b432b59bb4","Type":"ContainerStarted","Data":"f668e73eccb9ae027260398cd4c656b2c1c4645de8d3889a343b6e8e287f7bac"} Jan 27 00:20:56 crc kubenswrapper[4774]: I0127 00:20:56.026174 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" Jan 27 00:20:58 crc kubenswrapper[4774]: I0127 00:20:58.751174 4774 generic.go:334] "Generic (PLEG): container finished" podID="357826c8-1161-4d5b-9f9b-f1b432b59bb4" containerID="9c4f05efee4df97958a36a2ceae5c08933e7191a90c105019b63116157393129" exitCode=0 Jan 27 00:20:58 crc kubenswrapper[4774]: I0127 00:20:58.751265 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"357826c8-1161-4d5b-9f9b-f1b432b59bb4","Type":"ContainerDied","Data":"9c4f05efee4df97958a36a2ceae5c08933e7191a90c105019b63116157393129"} Jan 27 00:20:59 crc kubenswrapper[4774]: I0127 00:20:59.760963 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"357826c8-1161-4d5b-9f9b-f1b432b59bb4","Type":"ContainerStarted","Data":"eb1dad4864af4f00eec4bc1736ac8585f605d9937ad1e15a146624a4cb19819b"} Jan 27 00:20:59 crc kubenswrapper[4774]: I0127 00:20:59.811107 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-1-build" podStartSLOduration=2.81807664 podStartE2EDuration="8.811064209s" podCreationTimestamp="2026-01-27 00:20:51 +0000 UTC" firstStartedPulling="2026-01-27 00:20:52.113360115 +0000 UTC m=+830.419137009" lastFinishedPulling="2026-01-27 00:20:58.106347694 +0000 UTC m=+836.412124578" observedRunningTime="2026-01-27 00:20:59.803449037 +0000 UTC m=+838.109226001" watchObservedRunningTime="2026-01-27 00:20:59.811064209 +0000 UTC m=+838.116841143" Jan 27 00:20:59 crc kubenswrapper[4774]: I0127 00:20:59.916501 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="b4f320b3-4c9f-433a-943a-8f2934061b87" containerName="elasticsearch" probeResult="failure" output=< Jan 27 00:20:59 crc kubenswrapper[4774]: {"timestamp": "2026-01-27T00:20:59+00:00", "message": "readiness probe failed", "curl_rc": "7"} Jan 27 00:20:59 crc kubenswrapper[4774]: > Jan 27 00:21:01 crc kubenswrapper[4774]: I0127 00:21:01.571772 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 27 00:21:01 crc kubenswrapper[4774]: I0127 
00:21:01.775447 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="357826c8-1161-4d5b-9f9b-f1b432b59bb4" containerName="docker-build" containerID="cri-o://eb1dad4864af4f00eec4bc1736ac8585f605d9937ad1e15a146624a4cb19819b" gracePeriod=30 Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.168382 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.171516 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.173521 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.173774 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.174161 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.201669 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.226935 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227051 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227105 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227139 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227177 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh4cf\" (UniqueName: \"kubernetes.io/projected/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-kube-api-access-jh4cf\") pod 
\"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227209 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227276 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227326 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227363 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227406 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227440 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227483 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334309 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334379 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334407 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334434 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334463 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334508 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334534 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334561 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334580 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334605 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jh4cf\" (UniqueName: \"kubernetes.io/projected/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-kube-api-access-jh4cf\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334631 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334673 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.335018 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.335495 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.335523 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.335591 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.335930 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.335975 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.336171 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.336420 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.340721 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.341487 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.342918 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.352920 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh4cf\" (UniqueName: \"kubernetes.io/projected/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-kube-api-access-jh4cf\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.540980 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:04 crc kubenswrapper[4774]: I0127 00:21:04.563365 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 27 00:21:04 crc kubenswrapper[4774]: I0127 00:21:04.800478 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48","Type":"ContainerStarted","Data":"5bbb584405fb7631a8a17fc2b018cadebd4020bea20a1f59c42abfddafe4d91f"} Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.086388 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.815784 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_357826c8-1161-4d5b-9f9b-f1b432b59bb4/docker-build/0.log" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.818761 4774 generic.go:334] "Generic (PLEG): container finished" podID="357826c8-1161-4d5b-9f9b-f1b432b59bb4" containerID="eb1dad4864af4f00eec4bc1736ac8585f605d9937ad1e15a146624a4cb19819b" exitCode=1 Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.818968 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"357826c8-1161-4d5b-9f9b-f1b432b59bb4","Type":"ContainerDied","Data":"eb1dad4864af4f00eec4bc1736ac8585f605d9937ad1e15a146624a4cb19819b"} Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.821916 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48","Type":"ContainerStarted","Data":"65bae08af99d67fd59ec59dbc59a548eb6b9f634c7a63b5b18d30048a6914234"} Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.965901 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_357826c8-1161-4d5b-9f9b-f1b432b59bb4/docker-build/0.log" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.966762 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.981882 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildworkdir\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.981958 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-node-pullsecrets\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.981988 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-system-configs\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.982070 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-ca-bundles\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.982066 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.982130 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-pull\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.982157 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wx9bg\" (UniqueName: \"kubernetes.io/projected/357826c8-1161-4d5b-9f9b-f1b432b59bb4-kube-api-access-wx9bg\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.982496 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.982849 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.982995 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.983075 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-proxy-ca-bundles\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.983739 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.983893 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-blob-cache\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.983924 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-push\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.983988 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-root\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984023 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-run\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984068 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildcachedir\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984322 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984427 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984714 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984837 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984852 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984887 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984896 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984906 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984916 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984924 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984980 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.992163 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.996081 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/357826c8-1161-4d5b-9f9b-f1b432b59bb4-kube-api-access-wx9bg" (OuterVolumeSpecName: "kube-api-access-wx9bg") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "kube-api-access-wx9bg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.996117 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.086567 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.086603 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wx9bg\" (UniqueName: \"kubernetes.io/projected/357826c8-1161-4d5b-9f9b-f1b432b59bb4-kube-api-access-wx9bg\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.086615 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.086627 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.086638 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.829717 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_357826c8-1161-4d5b-9f9b-f1b432b59bb4/docker-build/0.log" Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.830507 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"357826c8-1161-4d5b-9f9b-f1b432b59bb4","Type":"ContainerDied","Data":"f668e73eccb9ae027260398cd4c656b2c1c4645de8d3889a343b6e8e287f7bac"} Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.830518 4774 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.830582 4774 scope.go:117] "RemoveContainer" containerID="eb1dad4864af4f00eec4bc1736ac8585f605d9937ad1e15a146624a4cb19819b" Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.854956 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.856469 4774 scope.go:117] "RemoveContainer" containerID="9c4f05efee4df97958a36a2ceae5c08933e7191a90c105019b63116157393129" Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.862102 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 27 00:21:08 crc kubenswrapper[4774]: I0127 00:21:08.364077 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="357826c8-1161-4d5b-9f9b-f1b432b59bb4" path="/var/lib/kubelet/pods/357826c8-1161-4d5b-9f9b-f1b432b59bb4/volumes" Jan 27 00:21:12 crc kubenswrapper[4774]: I0127 00:21:12.875111 4774 generic.go:334] "Generic (PLEG): container finished" podID="e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" containerID="65bae08af99d67fd59ec59dbc59a548eb6b9f634c7a63b5b18d30048a6914234" exitCode=0 Jan 27 00:21:12 crc kubenswrapper[4774]: I0127 00:21:12.875196 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48","Type":"ContainerDied","Data":"65bae08af99d67fd59ec59dbc59a548eb6b9f634c7a63b5b18d30048a6914234"} Jan 27 00:21:13 crc kubenswrapper[4774]: I0127 00:21:13.893571 4774 generic.go:334] "Generic (PLEG): container finished" podID="e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" containerID="324a3bc382f8ead3e05849a9fd24d9f5c0bfb892f82bc846544ebd46627c5a36" exitCode=0 Jan 27 00:21:13 crc kubenswrapper[4774]: I0127 00:21:13.893642 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48","Type":"ContainerDied","Data":"324a3bc382f8ead3e05849a9fd24d9f5c0bfb892f82bc846544ebd46627c5a36"} Jan 27 00:21:13 crc kubenswrapper[4774]: I0127 00:21:13.944060 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_e97d63e3-09a4-4cd1-9019-b10dfb8d2d48/manage-dockerfile/0.log" Jan 27 00:21:14 crc kubenswrapper[4774]: I0127 00:21:14.905964 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48","Type":"ContainerStarted","Data":"c6b52da603998873d5e158aa465c1e4ee9dee4ac321b833341a4872c06415c80"} Jan 27 00:21:14 crc kubenswrapper[4774]: I0127 00:21:14.955384 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-2-build" podStartSLOduration=11.955364512 podStartE2EDuration="11.955364512s" podCreationTimestamp="2026-01-27 00:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:21:14.951400276 +0000 UTC m=+853.257177160" watchObservedRunningTime="2026-01-27 00:21:14.955364512 +0000 UTC m=+853.261141406" Jan 27 00:22:35 crc kubenswrapper[4774]: I0127 00:22:35.482631 4774 generic.go:334] "Generic (PLEG): container finished" 
podID="e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" containerID="c6b52da603998873d5e158aa465c1e4ee9dee4ac321b833341a4872c06415c80" exitCode=0 Jan 27 00:22:35 crc kubenswrapper[4774]: I0127 00:22:35.482659 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48","Type":"ContainerDied","Data":"c6b52da603998873d5e158aa465c1e4ee9dee4ac321b833341a4872c06415c80"} Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.675633 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.675701 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.797618 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.925162 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-proxy-ca-bundles\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.925236 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh4cf\" (UniqueName: \"kubernetes.io/projected/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-kube-api-access-jh4cf\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.925283 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-run\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.925328 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildworkdir\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.925363 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-blob-cache\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.925446 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildcachedir\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: 
\"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.925545 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-ca-bundles\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.925619 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.925727 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-pull\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.925945 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-system-configs\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.926091 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-push\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.926245 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-root\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.926381 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-node-pullsecrets\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.926291 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.926332 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.926558 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.926655 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.927128 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.927155 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.927167 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.927176 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.927185 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.928628 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.933786 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.934762 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.935776 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-kube-api-access-jh4cf" (OuterVolumeSpecName: "kube-api-access-jh4cf") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "kube-api-access-jh4cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.966705 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:22:37 crc kubenswrapper[4774]: I0127 00:22:37.029312 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:37 crc kubenswrapper[4774]: I0127 00:22:37.029395 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:37 crc kubenswrapper[4774]: I0127 00:22:37.029409 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh4cf\" (UniqueName: \"kubernetes.io/projected/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-kube-api-access-jh4cf\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:37 crc kubenswrapper[4774]: I0127 00:22:37.029422 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:37 crc kubenswrapper[4774]: I0127 00:22:37.029433 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:37 crc kubenswrapper[4774]: I0127 00:22:37.131710 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:22:37 crc kubenswrapper[4774]: I0127 00:22:37.234433 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:37 crc kubenswrapper[4774]: I0127 00:22:37.503372 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48","Type":"ContainerDied","Data":"5bbb584405fb7631a8a17fc2b018cadebd4020bea20a1f59c42abfddafe4d91f"} Jan 27 00:22:37 crc kubenswrapper[4774]: I0127 00:22:37.503755 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bbb584405fb7631a8a17fc2b018cadebd4020bea20a1f59c42abfddafe4d91f" Jan 27 00:22:37 crc kubenswrapper[4774]: I0127 00:22:37.503518 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:22:39 crc kubenswrapper[4774]: I0127 00:22:39.012234 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:22:39 crc kubenswrapper[4774]: I0127 00:22:39.063184 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.791390 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Jan 27 00:22:41 crc kubenswrapper[4774]: E0127 00:22:41.792013 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" containerName="git-clone" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.792026 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" containerName="git-clone" Jan 27 00:22:41 crc kubenswrapper[4774]: E0127 00:22:41.792037 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357826c8-1161-4d5b-9f9b-f1b432b59bb4" containerName="docker-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.792045 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="357826c8-1161-4d5b-9f9b-f1b432b59bb4" containerName="docker-build" Jan 27 00:22:41 crc kubenswrapper[4774]: E0127 00:22:41.792058 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" containerName="manage-dockerfile" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.792064 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" containerName="manage-dockerfile" Jan 27 00:22:41 crc kubenswrapper[4774]: E0127 00:22:41.792074 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" containerName="docker-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.792079 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" 
containerName="docker-build" Jan 27 00:22:41 crc kubenswrapper[4774]: E0127 00:22:41.792087 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357826c8-1161-4d5b-9f9b-f1b432b59bb4" containerName="manage-dockerfile" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.792093 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="357826c8-1161-4d5b-9f9b-f1b432b59bb4" containerName="manage-dockerfile" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.792193 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="357826c8-1161-4d5b-9f9b-f1b432b59bb4" containerName="docker-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.792208 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" containerName="docker-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.792882 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.795880 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.796140 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-global-ca" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.796329 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-ca" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.796470 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-sys-config" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.811973 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902649 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902692 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902720 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902748 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: 
\"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902765 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902793 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902812 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902834 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdtr6\" (UniqueName: \"kubernetes.io/projected/186f6989-c0d9-41b8-9036-6f5f74966b06-kube-api-access-kdtr6\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902875 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902893 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902915 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902929 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " 
pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004564 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004622 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004647 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdtr6\" (UniqueName: \"kubernetes.io/projected/186f6989-c0d9-41b8-9036-6f5f74966b06-kube-api-access-kdtr6\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004709 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004734 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004764 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004788 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004828 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004877 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004909 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004942 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004963 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.005131 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.005225 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.005279 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.005416 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.005705 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.005747 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.006990 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.007257 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.007594 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.012734 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.015205 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.021944 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdtr6\" (UniqueName: \"kubernetes.io/projected/186f6989-c0d9-41b8-9036-6f5f74966b06-kube-api-access-kdtr6\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.111109 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.335729 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.542738 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"186f6989-c0d9-41b8-9036-6f5f74966b06","Type":"ContainerStarted","Data":"efbb54a015404b8967513a09f9917177f0d8854494fa4a5dbf69d8215e48a96a"} Jan 27 00:22:43 crc kubenswrapper[4774]: I0127 00:22:43.553723 4774 generic.go:334] "Generic (PLEG): container finished" podID="186f6989-c0d9-41b8-9036-6f5f74966b06" containerID="6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f" exitCode=0 Jan 27 00:22:43 crc kubenswrapper[4774]: I0127 00:22:43.553784 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"186f6989-c0d9-41b8-9036-6f5f74966b06","Type":"ContainerDied","Data":"6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f"} Jan 27 00:22:44 crc kubenswrapper[4774]: I0127 00:22:44.566081 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"186f6989-c0d9-41b8-9036-6f5f74966b06","Type":"ContainerStarted","Data":"365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20"} Jan 27 00:22:44 crc kubenswrapper[4774]: I0127 00:22:44.604268 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=3.604240004 podStartE2EDuration="3.604240004s" podCreationTimestamp="2026-01-27 00:22:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:22:44.602731324 +0000 UTC m=+942.908508238" watchObservedRunningTime="2026-01-27 00:22:44.604240004 +0000 UTC m=+942.910016888" Jan 27 00:22:52 crc kubenswrapper[4774]: I0127 00:22:52.313372 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Jan 27 00:22:52 crc kubenswrapper[4774]: I0127 00:22:52.314777 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="186f6989-c0d9-41b8-9036-6f5f74966b06" containerName="docker-build" containerID="cri-o://365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20" gracePeriod=30 Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.265151 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_186f6989-c0d9-41b8-9036-6f5f74966b06/docker-build/0.log" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.266260 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407018 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-run\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407092 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-buildcachedir\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407140 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-node-pullsecrets\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407173 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-buildworkdir\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407256 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-system-configs\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407243 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407299 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-proxy-ca-bundles\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407334 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-ca-bundles\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407367 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-pull\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407281 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407423 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-build-blob-cache\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407471 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-push\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407525 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-root\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407550 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdtr6\" (UniqueName: \"kubernetes.io/projected/186f6989-c0d9-41b8-9036-6f5f74966b06-kube-api-access-kdtr6\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407942 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407958 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.408385 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.408466 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.409029 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.409078 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.409468 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.414402 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.415747 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/186f6989-c0d9-41b8-9036-6f5f74966b06-kube-api-access-kdtr6" (OuterVolumeSpecName: "kube-api-access-kdtr6") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "kube-api-access-kdtr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.417732 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.509528 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.509571 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdtr6\" (UniqueName: \"kubernetes.io/projected/186f6989-c0d9-41b8-9036-6f5f74966b06-kube-api-access-kdtr6\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.509583 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.509596 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.509611 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.509623 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.509750 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.509763 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.605120 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.611664 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.643661 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_186f6989-c0d9-41b8-9036-6f5f74966b06/docker-build/0.log" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.644692 4774 generic.go:334] "Generic (PLEG): container finished" podID="186f6989-c0d9-41b8-9036-6f5f74966b06" containerID="365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20" exitCode=1 Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.644807 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.644783 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"186f6989-c0d9-41b8-9036-6f5f74966b06","Type":"ContainerDied","Data":"365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20"} Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.645094 4774 scope.go:117] "RemoveContainer" containerID="365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.645023 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"186f6989-c0d9-41b8-9036-6f5f74966b06","Type":"ContainerDied","Data":"efbb54a015404b8967513a09f9917177f0d8854494fa4a5dbf69d8215e48a96a"} Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.746635 4774 scope.go:117] "RemoveContainer" containerID="6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.779608 4774 scope.go:117] "RemoveContainer" containerID="365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20" Jan 27 00:22:53 crc kubenswrapper[4774]: E0127 00:22:53.780256 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20\": container with ID starting with 365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20 not found: ID does not exist" containerID="365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.780362 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20"} err="failed to get container status \"365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20\": rpc error: code = NotFound desc = could not find container \"365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20\": container with ID starting with 365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20 not found: ID does not exist" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.780462 4774 scope.go:117] "RemoveContainer" containerID="6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f" Jan 27 00:22:53 crc kubenswrapper[4774]: E0127 00:22:53.780870 4774 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f\": container with ID starting with 6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f not found: ID does not exist" containerID="6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.780902 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f"} err="failed to get container status \"6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f\": rpc error: code = NotFound desc = could not find container \"6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f\": container with ID starting with 6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f not found: ID does not exist" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.845921 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.925524 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.931097 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Jan 27 00:22:53 crc kubenswrapper[4774]: E0127 00:22:53.931525 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="186f6989-c0d9-41b8-9036-6f5f74966b06" containerName="manage-dockerfile" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.931560 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="186f6989-c0d9-41b8-9036-6f5f74966b06" containerName="manage-dockerfile" Jan 27 00:22:53 crc kubenswrapper[4774]: E0127 00:22:53.931602 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="186f6989-c0d9-41b8-9036-6f5f74966b06" containerName="docker-build" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.931622 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="186f6989-c0d9-41b8-9036-6f5f74966b06" containerName="docker-build" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.931920 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="186f6989-c0d9-41b8-9036-6f5f74966b06" containerName="docker-build" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.933549 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.937961 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-sys-config" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.937988 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-global-ca" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.938395 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-ca" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.964433 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.012119 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.026216 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.027203 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.027369 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.027451 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.027488 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.027651 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.027736 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.027765 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.027799 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.027831 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.027971 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.028068 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.028122 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69k7b\" (UniqueName: \"kubernetes.io/projected/1c7b5d79-5dff-4a72-9864-22416e852110-kube-api-access-69k7b\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.130541 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.130654 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: 
\"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.130705 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.130772 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.130847 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.130916 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.130961 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.131006 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.131078 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.131126 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.131160 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-69k7b\" (UniqueName: \"kubernetes.io/projected/1c7b5d79-5dff-4a72-9864-22416e852110-kube-api-access-69k7b\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.131179 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.131429 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.131227 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.131832 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.131939 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.132029 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.132197 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.132522 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.133059 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.133134 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.136534 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.137985 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.155366 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69k7b\" (UniqueName: \"kubernetes.io/projected/1c7b5d79-5dff-4a72-9864-22416e852110-kube-api-access-69k7b\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.266093 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.372553 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="186f6989-c0d9-41b8-9036-6f5f74966b06" path="/var/lib/kubelet/pods/186f6989-c0d9-41b8-9036-6f5f74966b06/volumes" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.551843 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.659456 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"1c7b5d79-5dff-4a72-9864-22416e852110","Type":"ContainerStarted","Data":"6ba529326fadbe0cbf460b5a864b9676a02a66194862dcf2709f9808afc41984"} Jan 27 00:22:55 crc kubenswrapper[4774]: I0127 00:22:55.675061 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"1c7b5d79-5dff-4a72-9864-22416e852110","Type":"ContainerStarted","Data":"9fa2ef5e46f34f2f8bee5eebee95b472ff3968fc4432383de2683e43e9110407"} Jan 27 00:22:56 crc kubenswrapper[4774]: I0127 00:22:56.686986 4774 generic.go:334] "Generic (PLEG): container finished" podID="1c7b5d79-5dff-4a72-9864-22416e852110" containerID="9fa2ef5e46f34f2f8bee5eebee95b472ff3968fc4432383de2683e43e9110407" exitCode=0 Jan 27 00:22:56 crc kubenswrapper[4774]: I0127 00:22:56.687113 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"1c7b5d79-5dff-4a72-9864-22416e852110","Type":"ContainerDied","Data":"9fa2ef5e46f34f2f8bee5eebee95b472ff3968fc4432383de2683e43e9110407"} Jan 27 00:22:57 crc kubenswrapper[4774]: I0127 00:22:57.696359 4774 generic.go:334] "Generic (PLEG): container finished" podID="1c7b5d79-5dff-4a72-9864-22416e852110" containerID="e38e1b3fb869e2dfbe9321a3bb8c8c6137628317c6b252437553f0df41c5c826" exitCode=0 Jan 27 00:22:57 crc kubenswrapper[4774]: I0127 00:22:57.696416 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"1c7b5d79-5dff-4a72-9864-22416e852110","Type":"ContainerDied","Data":"e38e1b3fb869e2dfbe9321a3bb8c8c6137628317c6b252437553f0df41c5c826"} Jan 27 00:22:57 crc kubenswrapper[4774]: I0127 00:22:57.755193 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_1c7b5d79-5dff-4a72-9864-22416e852110/manage-dockerfile/0.log" Jan 27 00:22:58 crc kubenswrapper[4774]: I0127 00:22:58.706427 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"1c7b5d79-5dff-4a72-9864-22416e852110","Type":"ContainerStarted","Data":"b39e5887c85163fcc961ded078df14a493b8d525c609add8e875af954134fc02"} Jan 27 00:22:58 crc kubenswrapper[4774]: I0127 00:22:58.744722 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=5.74468799 podStartE2EDuration="5.74468799s" podCreationTimestamp="2026-01-27 00:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:22:58.739460319 +0000 UTC m=+957.045237213" watchObservedRunningTime="2026-01-27 00:22:58.74468799 +0000 UTC m=+957.050464914" Jan 27 00:23:06 crc kubenswrapper[4774]: I0127 00:23:06.675690 4774 patch_prober.go:28] interesting 
pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:23:06 crc kubenswrapper[4774]: I0127 00:23:06.676516 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:23:36 crc kubenswrapper[4774]: I0127 00:23:36.675798 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:23:36 crc kubenswrapper[4774]: I0127 00:23:36.676930 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:23:36 crc kubenswrapper[4774]: I0127 00:23:36.677015 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:23:36 crc kubenswrapper[4774]: I0127 00:23:36.678138 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9eedfdb07f2482a1f5cd24c38dd8c573ca8994af4d572b2eee7e79784fd3eb29"} pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 00:23:36 crc kubenswrapper[4774]: I0127 00:23:36.678282 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" containerID="cri-o://9eedfdb07f2482a1f5cd24c38dd8c573ca8994af4d572b2eee7e79784fd3eb29" gracePeriod=600 Jan 27 00:23:39 crc kubenswrapper[4774]: I0127 00:23:39.210182 4774 generic.go:334] "Generic (PLEG): container finished" podID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerID="9eedfdb07f2482a1f5cd24c38dd8c573ca8994af4d572b2eee7e79784fd3eb29" exitCode=0 Jan 27 00:23:39 crc kubenswrapper[4774]: I0127 00:23:39.210878 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerDied","Data":"9eedfdb07f2482a1f5cd24c38dd8c573ca8994af4d572b2eee7e79784fd3eb29"} Jan 27 00:23:39 crc kubenswrapper[4774]: I0127 00:23:39.210908 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"7685c19213d51fa9221db2865a3a407e305a658c67b584f4201953f8284c60cb"} Jan 27 00:23:39 crc kubenswrapper[4774]: I0127 00:23:39.210926 4774 scope.go:117] "RemoveContainer" containerID="b1c2487de37d74b3854324adcbb324d646465194c27d05f07ed619de40219442" Jan 27 00:24:06 
crc kubenswrapper[4774]: I0127 00:24:06.478517 4774 generic.go:334] "Generic (PLEG): container finished" podID="1c7b5d79-5dff-4a72-9864-22416e852110" containerID="b39e5887c85163fcc961ded078df14a493b8d525c609add8e875af954134fc02" exitCode=0 Jan 27 00:24:06 crc kubenswrapper[4774]: I0127 00:24:06.479145 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"1c7b5d79-5dff-4a72-9864-22416e852110","Type":"ContainerDied","Data":"b39e5887c85163fcc961ded078df14a493b8d525c609add8e875af954134fc02"} Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.824220 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.890541 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-root\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.890940 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-run\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.891294 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69k7b\" (UniqueName: \"kubernetes.io/projected/1c7b5d79-5dff-4a72-9864-22416e852110-kube-api-access-69k7b\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.891543 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-build-blob-cache\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.891845 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-proxy-ca-bundles\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.892764 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.893403 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.893698 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-ca-bundles\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.894034 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-buildcachedir\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.894698 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-system-configs\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.895739 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-buildworkdir\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.896234 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-push\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.895020 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.895620 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.895669 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.897499 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-pull\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.898656 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-node-pullsecrets\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.898712 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.899926 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.900182 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.900460 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.900641 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.900818 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.901061 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.907562 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.909211 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.912158 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.915174 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c7b5d79-5dff-4a72-9864-22416e852110-kube-api-access-69k7b" (OuterVolumeSpecName: "kube-api-access-69k7b") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "kube-api-access-69k7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:24:08 crc kubenswrapper[4774]: I0127 00:24:08.002757 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69k7b\" (UniqueName: \"kubernetes.io/projected/1c7b5d79-5dff-4a72-9864-22416e852110-kube-api-access-69k7b\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:08 crc kubenswrapper[4774]: I0127 00:24:08.002803 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:08 crc kubenswrapper[4774]: I0127 00:24:08.002819 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:08 crc kubenswrapper[4774]: I0127 00:24:08.002834 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:08 crc kubenswrapper[4774]: I0127 00:24:08.075658 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:24:08 crc kubenswrapper[4774]: I0127 00:24:08.103330 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:08 crc kubenswrapper[4774]: I0127 00:24:08.510372 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"1c7b5d79-5dff-4a72-9864-22416e852110","Type":"ContainerDied","Data":"6ba529326fadbe0cbf460b5a864b9676a02a66194862dcf2709f9808afc41984"} Jan 27 00:24:08 crc kubenswrapper[4774]: I0127 00:24:08.510802 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ba529326fadbe0cbf460b5a864b9676a02a66194862dcf2709f9808afc41984" Jan 27 00:24:08 crc kubenswrapper[4774]: I0127 00:24:08.510953 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:24:09 crc kubenswrapper[4774]: I0127 00:24:09.978456 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:24:10 crc kubenswrapper[4774]: I0127 00:24:10.039639 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.507860 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"] Jan 27 00:24:12 crc kubenswrapper[4774]: E0127 00:24:12.508538 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c7b5d79-5dff-4a72-9864-22416e852110" containerName="docker-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.508553 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7b5d79-5dff-4a72-9864-22416e852110" containerName="docker-build" Jan 27 00:24:12 crc kubenswrapper[4774]: E0127 00:24:12.508585 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c7b5d79-5dff-4a72-9864-22416e852110" containerName="git-clone" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.508594 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7b5d79-5dff-4a72-9864-22416e852110" containerName="git-clone" Jan 27 00:24:12 crc kubenswrapper[4774]: E0127 00:24:12.508604 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c7b5d79-5dff-4a72-9864-22416e852110" containerName="manage-dockerfile" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.508611 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7b5d79-5dff-4a72-9864-22416e852110" containerName="manage-dockerfile" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.508736 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c7b5d79-5dff-4a72-9864-22416e852110" containerName="docker-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.509536 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.512196 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.512372 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-ca" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.512492 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-global-ca" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.512539 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-sys-config" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.533209 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.580437 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-push\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.580495 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.580537 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.580583 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-pull\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.580612 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-root\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.580638 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildcachedir\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.580759 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-system-configs\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.580998 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.581049 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qthh\" (UniqueName: \"kubernetes.io/projected/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-kube-api-access-4qthh\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.581097 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.581130 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildworkdir\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.581179 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-run\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.682970 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683106 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683165 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-pull\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683201 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-root\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683242 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683244 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildcachedir\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683320 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-system-configs\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683354 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildcachedir\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683580 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qthh\" (UniqueName: \"kubernetes.io/projected/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-kube-api-access-4qthh\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683660 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683742 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683853 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-root\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.684017 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildworkdir\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " 
pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.684652 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.684324 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildworkdir\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.684563 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.684513 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-system-configs\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.684816 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-run\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.684901 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-push\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.685039 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-run\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.685318 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.691523 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-pull\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.691641 4774 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-push\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.705591 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qthh\" (UniqueName: \"kubernetes.io/projected/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-kube-api-access-4qthh\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.827636 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Jan 27 00:24:13 crc kubenswrapper[4774]: I0127 00:24:13.136949 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Jan 27 00:24:13 crc kubenswrapper[4774]: I0127 00:24:13.573957 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4","Type":"ContainerStarted","Data":"f77c18044669e0ee98035472ea754e01d9c30dbaba522e3d30ba4cdb3c4735ff"} Jan 27 00:24:14 crc kubenswrapper[4774]: I0127 00:24:14.588590 4774 generic.go:334] "Generic (PLEG): container finished" podID="ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" containerID="4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9" exitCode=0 Jan 27 00:24:14 crc kubenswrapper[4774]: I0127 00:24:14.588752 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4","Type":"ContainerDied","Data":"4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9"} Jan 27 00:24:15 crc kubenswrapper[4774]: I0127 00:24:15.604320 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4","Type":"ContainerStarted","Data":"5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb"} Jan 27 00:24:15 crc kubenswrapper[4774]: I0127 00:24:15.630505 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=3.630481305 podStartE2EDuration="3.630481305s" podCreationTimestamp="2026-01-27 00:24:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:24:15.627277179 +0000 UTC m=+1033.933054073" watchObservedRunningTime="2026-01-27 00:24:15.630481305 +0000 UTC m=+1033.936258209" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.035562 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.036336 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" containerName="docker-build" containerID="cri-o://5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb" gracePeriod=30 Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.417648 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_ba507aea-37a7-4fb3-b2f1-64fb8966d9a4/docker-build/0.log" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.418795 4774 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470078 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-run\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470153 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-node-pullsecrets\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470237 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qthh\" (UniqueName: \"kubernetes.io/projected/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-kube-api-access-4qthh\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470304 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-proxy-ca-bundles\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470346 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-root\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470341 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470401 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-system-configs\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470510 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-pull\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470553 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-blob-cache\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470606 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-push\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470665 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildworkdir\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470772 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-ca-bundles\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470818 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildcachedir\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.471195 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.471406 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.471434 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.472670 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.472809 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.472931 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.473088 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.473135 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.477661 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.478152 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-kube-api-access-4qthh" (OuterVolumeSpecName: "kube-api-access-4qthh") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "kube-api-access-4qthh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.479252 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.573255 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.573289 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qthh\" (UniqueName: \"kubernetes.io/projected/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-kube-api-access-4qthh\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.573298 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.573312 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.573321 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.573329 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.573339 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.573349 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.579172 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: 
"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.615989 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.671798 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_ba507aea-37a7-4fb3-b2f1-64fb8966d9a4/docker-build/0.log" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.673041 4774 generic.go:334] "Generic (PLEG): container finished" podID="ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" containerID="5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb" exitCode=1 Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.673106 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4","Type":"ContainerDied","Data":"5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb"} Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.673154 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4","Type":"ContainerDied","Data":"f77c18044669e0ee98035472ea754e01d9c30dbaba522e3d30ba4cdb3c4735ff"} Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.673162 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.673184 4774 scope.go:117] "RemoveContainer" containerID="5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.674660 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.674712 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.726547 4774 scope.go:117] "RemoveContainer" containerID="4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.736292 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.744843 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.785607 4774 scope.go:117] "RemoveContainer" containerID="5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb" Jan 27 00:24:23 crc kubenswrapper[4774]: E0127 00:24:23.786416 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb\": container with ID starting with 5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb not found: ID does not exist" containerID="5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.786492 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb"} err="failed to get container status \"5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb\": rpc error: code = NotFound desc = could not find container \"5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb\": container with ID starting with 5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb not found: ID does not exist" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.786541 4774 scope.go:117] "RemoveContainer" containerID="4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9" Jan 27 00:24:23 crc kubenswrapper[4774]: E0127 00:24:23.787941 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9\": container with ID starting with 4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9 not found: ID does not exist" containerID="4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.788019 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9"} err="failed to get container status \"4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9\": rpc error: code = NotFound desc = could 
not find container \"4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9\": container with ID starting with 4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9 not found: ID does not exist" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.371482 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" path="/var/lib/kubelet/pods/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4/volumes" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.845920 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"] Jan 27 00:24:24 crc kubenswrapper[4774]: E0127 00:24:24.846575 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" containerName="docker-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.846604 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" containerName="docker-build" Jan 27 00:24:24 crc kubenswrapper[4774]: E0127 00:24:24.846638 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" containerName="manage-dockerfile" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.846652 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" containerName="manage-dockerfile" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.846831 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" containerName="docker-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.848319 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.852073 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-sys-config" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.852337 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-ca" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.853441 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-global-ca" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.853689 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.878070 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.894424 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.894503 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.894545 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4blp9\" (UniqueName: \"kubernetes.io/projected/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-kube-api-access-4blp9\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.894675 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildcachedir\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.894761 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-system-configs\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.894810 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.894912 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-pull\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.894967 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildworkdir\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.895021 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-root\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.895160 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.895261 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-run\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: 
I0127 00:24:24.895302 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-push\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998124 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-root\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998237 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998302 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-run\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998347 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-push\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998394 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998426 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998444 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998465 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4blp9\" (UniqueName: \"kubernetes.io/projected/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-kube-api-access-4blp9\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998618 4774 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildcachedir\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998700 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-system-configs\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998746 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998832 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-pull\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998899 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildworkdir\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998935 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-root\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998989 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-run\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.999230 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildcachedir\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.999276 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildworkdir\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.999673 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-blob-cache\") pod \"sg-core-2-build\" (UID: 
\"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:25 crc kubenswrapper[4774]: I0127 00:24:25.000497 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-system-configs\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:25 crc kubenswrapper[4774]: I0127 00:24:25.000691 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:25 crc kubenswrapper[4774]: I0127 00:24:25.001046 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:25 crc kubenswrapper[4774]: I0127 00:24:25.010102 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-push\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:25 crc kubenswrapper[4774]: I0127 00:24:25.010579 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-pull\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:25 crc kubenswrapper[4774]: I0127 00:24:25.026646 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4blp9\" (UniqueName: \"kubernetes.io/projected/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-kube-api-access-4blp9\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:25 crc kubenswrapper[4774]: I0127 00:24:25.163640 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Jan 27 00:24:25 crc kubenswrapper[4774]: I0127 00:24:25.453200 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Jan 27 00:24:25 crc kubenswrapper[4774]: I0127 00:24:25.692457 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4","Type":"ContainerStarted","Data":"95377822472fd85dfa0453210cab1aec6065d108dc6943b9c636d085246845d4"} Jan 27 00:24:26 crc kubenswrapper[4774]: I0127 00:24:26.706024 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4","Type":"ContainerStarted","Data":"6438ea88db110cce5cbdfae5af7c90136af6a8c7be2b9b1644fc72bd6a2df7bd"} Jan 27 00:24:27 crc kubenswrapper[4774]: I0127 00:24:27.716282 4774 generic.go:334] "Generic (PLEG): container finished" podID="62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" containerID="6438ea88db110cce5cbdfae5af7c90136af6a8c7be2b9b1644fc72bd6a2df7bd" exitCode=0 Jan 27 00:24:27 crc kubenswrapper[4774]: I0127 00:24:27.716326 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4","Type":"ContainerDied","Data":"6438ea88db110cce5cbdfae5af7c90136af6a8c7be2b9b1644fc72bd6a2df7bd"} Jan 27 00:24:28 crc kubenswrapper[4774]: I0127 00:24:28.727122 4774 generic.go:334] "Generic (PLEG): container finished" podID="62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" containerID="6b54eca9456d1e4d144051ab9ec59ce55311d9899f1cb227c509f4950fa46689" exitCode=0 Jan 27 00:24:28 crc kubenswrapper[4774]: I0127 00:24:28.727195 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4","Type":"ContainerDied","Data":"6b54eca9456d1e4d144051ab9ec59ce55311d9899f1cb227c509f4950fa46689"} Jan 27 00:24:28 crc kubenswrapper[4774]: I0127 00:24:28.771229 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-2-build_62f2cb11-10c8-4570-a3f6-3e869ea9dfd4/manage-dockerfile/0.log" Jan 27 00:24:29 crc kubenswrapper[4774]: I0127 00:24:29.741487 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4","Type":"ContainerStarted","Data":"12e30af316e67a99c16a4887b96ef7142665aad739f15328a784cb874cc31ed0"} Jan 27 00:24:29 crc kubenswrapper[4774]: I0127 00:24:29.784504 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=5.78446784 podStartE2EDuration="5.78446784s" podCreationTimestamp="2026-01-27 00:24:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:24:29.783771751 +0000 UTC m=+1048.089548675" watchObservedRunningTime="2026-01-27 00:24:29.78446784 +0000 UTC m=+1048.090244794" Jan 27 00:26:06 crc kubenswrapper[4774]: I0127 00:26:06.675317 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:26:06 crc kubenswrapper[4774]: I0127 00:26:06.675956 4774 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:26:36 crc kubenswrapper[4774]: I0127 00:26:36.675449 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:26:36 crc kubenswrapper[4774]: I0127 00:26:36.676265 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:27:06 crc kubenswrapper[4774]: I0127 00:27:06.675711 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:27:06 crc kubenswrapper[4774]: I0127 00:27:06.676756 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:27:06 crc kubenswrapper[4774]: I0127 00:27:06.676842 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:27:06 crc kubenswrapper[4774]: I0127 00:27:06.678008 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7685c19213d51fa9221db2865a3a407e305a658c67b584f4201953f8284c60cb"} pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 00:27:06 crc kubenswrapper[4774]: I0127 00:27:06.678127 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" containerID="cri-o://7685c19213d51fa9221db2865a3a407e305a658c67b584f4201953f8284c60cb" gracePeriod=600 Jan 27 00:27:07 crc kubenswrapper[4774]: I0127 00:27:07.976066 4774 generic.go:334] "Generic (PLEG): container finished" podID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerID="7685c19213d51fa9221db2865a3a407e305a658c67b584f4201953f8284c60cb" exitCode=0 Jan 27 00:27:07 crc kubenswrapper[4774]: I0127 00:27:07.976133 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerDied","Data":"7685c19213d51fa9221db2865a3a407e305a658c67b584f4201953f8284c60cb"} Jan 27 00:27:07 crc kubenswrapper[4774]: I0127 00:27:07.977048 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"1c53cb6c38911d466299f8dab8954ff32a3cd2ab17025a91e5e6eb03240440a5"} Jan 27 00:27:07 crc kubenswrapper[4774]: I0127 00:27:07.977096 4774 scope.go:117] "RemoveContainer" containerID="9eedfdb07f2482a1f5cd24c38dd8c573ca8994af4d572b2eee7e79784fd3eb29" Jan 27 00:27:43 crc kubenswrapper[4774]: I0127 00:27:43.288337 4774 generic.go:334] "Generic (PLEG): container finished" podID="62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" containerID="12e30af316e67a99c16a4887b96ef7142665aad739f15328a784cb874cc31ed0" exitCode=0 Jan 27 00:27:43 crc kubenswrapper[4774]: I0127 00:27:43.288406 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4","Type":"ContainerDied","Data":"12e30af316e67a99c16a4887b96ef7142665aad739f15328a784cb874cc31ed0"} Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.673177 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.831318 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-root\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.832170 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4blp9\" (UniqueName: \"kubernetes.io/projected/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-kube-api-access-4blp9\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.832213 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-push\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.832245 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-run\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.832280 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-node-pullsecrets\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.832317 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-ca-bundles\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.832342 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-system-configs\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.832453 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildworkdir\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.832470 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.833165 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.833199 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.833247 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-proxy-ca-bundles\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.833320 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-blob-cache\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.833248 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.833443 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildcachedir\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.833492 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-pull\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.834360 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.833537 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.833559 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.834398 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.834507 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.834538 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.840468 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.842463 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-kube-api-access-4blp9" (OuterVolumeSpecName: "kube-api-access-4blp9") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "kube-api-access-4blp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.843970 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.852370 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.936936 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4blp9\" (UniqueName: \"kubernetes.io/projected/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-kube-api-access-4blp9\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.936995 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.937018 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.937040 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.937057 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.937074 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:45 crc kubenswrapper[4774]: I0127 00:27:45.191219 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:27:45 crc kubenswrapper[4774]: I0127 00:27:45.241778 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:45 crc kubenswrapper[4774]: I0127 00:27:45.312090 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4","Type":"ContainerDied","Data":"95377822472fd85dfa0453210cab1aec6065d108dc6943b9c636d085246845d4"} Jan 27 00:27:45 crc kubenswrapper[4774]: I0127 00:27:45.312158 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Jan 27 00:27:45 crc kubenswrapper[4774]: I0127 00:27:45.312163 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95377822472fd85dfa0453210cab1aec6065d108dc6943b9c636d085246845d4" Jan 27 00:27:48 crc kubenswrapper[4774]: I0127 00:27:48.005741 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:27:48 crc kubenswrapper[4774]: I0127 00:27:48.097182 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.670805 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"] Jan 27 00:27:49 crc kubenswrapper[4774]: E0127 00:27:49.672476 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" containerName="docker-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.672607 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" containerName="docker-build" Jan 27 00:27:49 crc kubenswrapper[4774]: E0127 00:27:49.672676 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" containerName="git-clone" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.672835 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" containerName="git-clone" Jan 27 00:27:49 crc kubenswrapper[4774]: E0127 00:27:49.672923 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" containerName="manage-dockerfile" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.672980 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" containerName="manage-dockerfile" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.673157 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" containerName="docker-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.674450 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.677403 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-global-ca" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.677843 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-ca" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.677897 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.678115 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-sys-config" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.697797 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.828676 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.828736 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.828771 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-push\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.828787 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xxrt\" (UniqueName: \"kubernetes.io/projected/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-kube-api-access-2xxrt\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.828806 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.828939 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.828996 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-pull\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.829028 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.829051 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.829267 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.829340 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.829395 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931244 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931341 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-pull\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931404 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931445 4774 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931498 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931528 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931593 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931713 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931787 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931835 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xxrt\" (UniqueName: \"kubernetes.io/projected/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-kube-api-access-2xxrt\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931906 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-push\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.932030 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.932320 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.932442 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.932482 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.932520 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.932696 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.933147 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.933441 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.933742 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.934347 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.941231 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-pull\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " 
pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.943854 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-push\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.956764 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xxrt\" (UniqueName: \"kubernetes.io/projected/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-kube-api-access-2xxrt\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:50 crc kubenswrapper[4774]: I0127 00:27:50.001956 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:50 crc kubenswrapper[4774]: I0127 00:27:50.321908 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Jan 27 00:27:50 crc kubenswrapper[4774]: I0127 00:27:50.370046 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"5ec0c529-6bbd-41cf-bc74-f8a226f562ee","Type":"ContainerStarted","Data":"ea252ad661071f08d5d4dbb63efce080e36285b3984e1fe7d27a8a1b1c69df27"} Jan 27 00:27:51 crc kubenswrapper[4774]: I0127 00:27:51.380326 4774 generic.go:334] "Generic (PLEG): container finished" podID="5ec0c529-6bbd-41cf-bc74-f8a226f562ee" containerID="f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f" exitCode=0 Jan 27 00:27:51 crc kubenswrapper[4774]: I0127 00:27:51.380399 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"5ec0c529-6bbd-41cf-bc74-f8a226f562ee","Type":"ContainerDied","Data":"f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f"} Jan 27 00:27:52 crc kubenswrapper[4774]: I0127 00:27:52.391723 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"5ec0c529-6bbd-41cf-bc74-f8a226f562ee","Type":"ContainerStarted","Data":"fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11"} Jan 27 00:27:52 crc kubenswrapper[4774]: I0127 00:27:52.438485 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=3.438457909 podStartE2EDuration="3.438457909s" podCreationTimestamp="2026-01-27 00:27:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:27:52.435378446 +0000 UTC m=+1250.741155330" watchObservedRunningTime="2026-01-27 00:27:52.438457909 +0000 UTC m=+1250.744234823" Jan 27 00:27:59 crc kubenswrapper[4774]: I0127 00:27:59.977777 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Jan 27 00:27:59 crc kubenswrapper[4774]: I0127 00:27:59.979125 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-bridge-1-build" podUID="5ec0c529-6bbd-41cf-bc74-f8a226f562ee" containerName="docker-build" containerID="cri-o://fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11" gracePeriod=30 Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.306565 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_sg-bridge-1-build_5ec0c529-6bbd-41cf-bc74-f8a226f562ee/docker-build/0.log" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.307527 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.408698 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildcachedir\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.408842 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-system-configs\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.408883 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.408913 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-run\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.409049 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-push\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.409174 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-blob-cache\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.409265 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-pull\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.409482 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xxrt\" (UniqueName: \"kubernetes.io/projected/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-kube-api-access-2xxrt\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.409542 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-node-pullsecrets\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.409606 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildworkdir\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.409771 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-root\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.409839 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-proxy-ca-bundles\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.409913 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-ca-bundles\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.410324 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.410675 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.410722 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.411375 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.412501 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.412517 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.412716 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.412854 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.422784 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-kube-api-access-2xxrt" (OuterVolumeSpecName: "kube-api-access-2xxrt") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "kube-api-access-2xxrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.422800 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.425315 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.473129 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_5ec0c529-6bbd-41cf-bc74-f8a226f562ee/docker-build/0.log" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.474416 4774 generic.go:334] "Generic (PLEG): container finished" podID="5ec0c529-6bbd-41cf-bc74-f8a226f562ee" containerID="fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11" exitCode=1 Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.474467 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"5ec0c529-6bbd-41cf-bc74-f8a226f562ee","Type":"ContainerDied","Data":"fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11"} Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.474542 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"5ec0c529-6bbd-41cf-bc74-f8a226f562ee","Type":"ContainerDied","Data":"ea252ad661071f08d5d4dbb63efce080e36285b3984e1fe7d27a8a1b1c69df27"} Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.474663 4774 scope.go:117] "RemoveContainer" containerID="fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.475001 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.498434 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.504253 4774 scope.go:117] "RemoveContainer" containerID="f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.513297 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.513331 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.513345 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.513358 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.513370 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.513381 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.513392 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xxrt\" (UniqueName: \"kubernetes.io/projected/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-kube-api-access-2xxrt\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.513404 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.513566 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.536372 4774 scope.go:117] "RemoveContainer" containerID="fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11" Jan 27 00:28:00 crc kubenswrapper[4774]: E0127 00:28:00.537161 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11\": container with ID starting with fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11 not found: ID does not exist" containerID="fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.537238 4774 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11"} err="failed to get container status \"fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11\": rpc error: code = NotFound desc = could not find container \"fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11\": container with ID starting with fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11 not found: ID does not exist" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.537283 4774 scope.go:117] "RemoveContainer" containerID="f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f" Jan 27 00:28:00 crc kubenswrapper[4774]: E0127 00:28:00.537806 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f\": container with ID starting with f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f not found: ID does not exist" containerID="f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.537934 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f"} err="failed to get container status \"f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f\": rpc error: code = NotFound desc = could not find container \"f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f\": container with ID starting with f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f not found: ID does not exist" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.986260 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.021984 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.120977 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.134666 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.672009 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"] Jan 27 00:28:01 crc kubenswrapper[4774]: E0127 00:28:01.672650 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec0c529-6bbd-41cf-bc74-f8a226f562ee" containerName="manage-dockerfile" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.672737 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec0c529-6bbd-41cf-bc74-f8a226f562ee" containerName="manage-dockerfile" Jan 27 00:28:01 crc kubenswrapper[4774]: E0127 00:28:01.672834 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec0c529-6bbd-41cf-bc74-f8a226f562ee" containerName="docker-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.672982 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec0c529-6bbd-41cf-bc74-f8a226f562ee" containerName="docker-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.673172 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec0c529-6bbd-41cf-bc74-f8a226f562ee" containerName="docker-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.674350 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.678820 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-sys-config" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.679244 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.679344 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-ca" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.679404 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-global-ca" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.711391 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.735899 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.735986 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.736065 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.736109 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.736147 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.736179 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-pull\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.736207 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzn25\" 
(UniqueName: \"kubernetes.io/projected/335eb417-b9c1-42a3-8725-2f18e5dab7a8-kube-api-access-nzn25\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.736236 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.736267 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.736326 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.736359 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.736410 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-push\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.836908 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.837180 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-pull\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.837279 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzn25\" (UniqueName: \"kubernetes.io/projected/335eb417-b9c1-42a3-8725-2f18e5dab7a8-kube-api-access-nzn25\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.837385 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.837459 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.837539 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.837607 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.837692 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-push\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.837767 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.837852 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.838471 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.837409 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.838187 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildworkdir\") pod 
\"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.838231 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.838086 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.838525 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.838644 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.838707 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.838892 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.839162 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.839161 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.841338 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-push\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc 
kubenswrapper[4774]: I0127 00:28:01.842277 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-pull\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.857324 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzn25\" (UniqueName: \"kubernetes.io/projected/335eb417-b9c1-42a3-8725-2f18e5dab7a8-kube-api-access-nzn25\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.995488 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:02 crc kubenswrapper[4774]: I0127 00:28:02.273540 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Jan 27 00:28:02 crc kubenswrapper[4774]: I0127 00:28:02.372154 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec0c529-6bbd-41cf-bc74-f8a226f562ee" path="/var/lib/kubelet/pods/5ec0c529-6bbd-41cf-bc74-f8a226f562ee/volumes" Jan 27 00:28:02 crc kubenswrapper[4774]: I0127 00:28:02.501300 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"335eb417-b9c1-42a3-8725-2f18e5dab7a8","Type":"ContainerStarted","Data":"f4695ee39043ee1cf7334009731d6730fe0de886ad8fee5d4726ff3874944e3a"} Jan 27 00:28:03 crc kubenswrapper[4774]: I0127 00:28:03.509161 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"335eb417-b9c1-42a3-8725-2f18e5dab7a8","Type":"ContainerStarted","Data":"e54a1121c6dba1bbac9e3c48e316c05cc45c49c651516fbd46a91ccf1b04c0cd"} Jan 27 00:28:04 crc kubenswrapper[4774]: I0127 00:28:04.519926 4774 generic.go:334] "Generic (PLEG): container finished" podID="335eb417-b9c1-42a3-8725-2f18e5dab7a8" containerID="e54a1121c6dba1bbac9e3c48e316c05cc45c49c651516fbd46a91ccf1b04c0cd" exitCode=0 Jan 27 00:28:04 crc kubenswrapper[4774]: I0127 00:28:04.520069 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"335eb417-b9c1-42a3-8725-2f18e5dab7a8","Type":"ContainerDied","Data":"e54a1121c6dba1bbac9e3c48e316c05cc45c49c651516fbd46a91ccf1b04c0cd"} Jan 27 00:28:05 crc kubenswrapper[4774]: I0127 00:28:05.531467 4774 generic.go:334] "Generic (PLEG): container finished" podID="335eb417-b9c1-42a3-8725-2f18e5dab7a8" containerID="9be9db062f94ba7560ebf270e6cdf2ea9fba53b2de3e472a109511b2d455e78e" exitCode=0 Jan 27 00:28:05 crc kubenswrapper[4774]: I0127 00:28:05.531562 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"335eb417-b9c1-42a3-8725-2f18e5dab7a8","Type":"ContainerDied","Data":"9be9db062f94ba7560ebf270e6cdf2ea9fba53b2de3e472a109511b2d455e78e"} Jan 27 00:28:05 crc kubenswrapper[4774]: I0127 00:28:05.572007 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_335eb417-b9c1-42a3-8725-2f18e5dab7a8/manage-dockerfile/0.log" Jan 27 00:28:06 crc kubenswrapper[4774]: I0127 00:28:06.543778 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" 
event={"ID":"335eb417-b9c1-42a3-8725-2f18e5dab7a8","Type":"ContainerStarted","Data":"22ca06acc03323344bb1ef113c9898343cd315ad78e6e01547677009e369b46d"} Jan 27 00:28:06 crc kubenswrapper[4774]: I0127 00:28:06.595225 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-2-build" podStartSLOduration=5.595194673 podStartE2EDuration="5.595194673s" podCreationTimestamp="2026-01-27 00:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:28:06.585508873 +0000 UTC m=+1264.891285847" watchObservedRunningTime="2026-01-27 00:28:06.595194673 +0000 UTC m=+1264.900971577" Jan 27 00:28:54 crc kubenswrapper[4774]: I0127 00:28:54.957386 4774 generic.go:334] "Generic (PLEG): container finished" podID="335eb417-b9c1-42a3-8725-2f18e5dab7a8" containerID="22ca06acc03323344bb1ef113c9898343cd315ad78e6e01547677009e369b46d" exitCode=0 Jan 27 00:28:54 crc kubenswrapper[4774]: I0127 00:28:54.957488 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"335eb417-b9c1-42a3-8725-2f18e5dab7a8","Type":"ContainerDied","Data":"22ca06acc03323344bb1ef113c9898343cd315ad78e6e01547677009e369b46d"} Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.351562 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.452655 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-proxy-ca-bundles\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453198 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildworkdir\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453276 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-root\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453309 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-blob-cache\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453390 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-pull\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453424 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-node-pullsecrets\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453450 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-run\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453475 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-ca-bundles\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453531 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildcachedir\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453542 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453576 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-push\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453664 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzn25\" (UniqueName: \"kubernetes.io/projected/335eb417-b9c1-42a3-8725-2f18e5dab7a8-kube-api-access-nzn25\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453713 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-system-configs\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.454057 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.454187 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.454776 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.454847 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.455539 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.455750 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.455869 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.461006 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.461023 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/335eb417-b9c1-42a3-8725-2f18e5dab7a8-kube-api-access-nzn25" (OuterVolumeSpecName: "kube-api-access-nzn25") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "kube-api-access-nzn25". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.461065 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.555893 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.555931 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.555943 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.555953 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.555962 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.555971 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzn25\" (UniqueName: \"kubernetes.io/projected/335eb417-b9c1-42a3-8725-2f18e5dab7a8-kube-api-access-nzn25\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.555980 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.555989 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.556000 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.592175 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.658154 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.978741 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"335eb417-b9c1-42a3-8725-2f18e5dab7a8","Type":"ContainerDied","Data":"f4695ee39043ee1cf7334009731d6730fe0de886ad8fee5d4726ff3874944e3a"} Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.978788 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4695ee39043ee1cf7334009731d6730fe0de886ad8fee5d4726ff3874944e3a" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.978918 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:57 crc kubenswrapper[4774]: I0127 00:28:57.345921 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:28:57 crc kubenswrapper[4774]: I0127 00:28:57.370069 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.932854 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Jan 27 00:29:00 crc kubenswrapper[4774]: E0127 00:29:00.933652 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="335eb417-b9c1-42a3-8725-2f18e5dab7a8" containerName="manage-dockerfile" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.933670 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="335eb417-b9c1-42a3-8725-2f18e5dab7a8" containerName="manage-dockerfile" Jan 27 00:29:00 crc kubenswrapper[4774]: E0127 00:29:00.933694 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="335eb417-b9c1-42a3-8725-2f18e5dab7a8" containerName="docker-build" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.933703 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="335eb417-b9c1-42a3-8725-2f18e5dab7a8" containerName="docker-build" Jan 27 00:29:00 crc kubenswrapper[4774]: E0127 00:29:00.933725 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="335eb417-b9c1-42a3-8725-2f18e5dab7a8" containerName="git-clone" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.933735 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="335eb417-b9c1-42a3-8725-2f18e5dab7a8" containerName="git-clone" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.933915 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="335eb417-b9c1-42a3-8725-2f18e5dab7a8" containerName="docker-build" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.935054 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.937699 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-global-ca" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.937910 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-ca" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.937924 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.938168 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-sys-config" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.958112 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.083406 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.083460 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.083483 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.083506 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.083754 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.083970 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " 
pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.084014 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.084058 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.084097 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.084150 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ms8w\" (UniqueName: \"kubernetes.io/projected/e6b5f484-b9d4-4133-8d1f-af31e71ea317-kube-api-access-4ms8w\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.084264 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.084371 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.185510 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.185588 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.185626 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.185670 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.185702 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.185741 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ms8w\" (UniqueName: \"kubernetes.io/projected/e6b5f484-b9d4-4133-8d1f-af31e71ea317-kube-api-access-4ms8w\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.185779 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.185936 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.186314 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.185795 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.186526 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") 
" pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.186695 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.186795 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.186907 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.187040 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.187241 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.187427 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.187553 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.187536 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.188015 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.188181 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.193537 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.201270 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.207412 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ms8w\" (UniqueName: \"kubernetes.io/projected/e6b5f484-b9d4-4133-8d1f-af31e71ea317-kube-api-access-4ms8w\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.256261 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.529952 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Jan 27 00:29:02 crc kubenswrapper[4774]: I0127 00:29:02.025442 4774 generic.go:334] "Generic (PLEG): container finished" podID="e6b5f484-b9d4-4133-8d1f-af31e71ea317" containerID="4f2d7297d4d28dd848f59b341d688f7920adc8f241b7e1da6ad4079f86054ae7" exitCode=0 Jan 27 00:29:02 crc kubenswrapper[4774]: I0127 00:29:02.025535 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"e6b5f484-b9d4-4133-8d1f-af31e71ea317","Type":"ContainerDied","Data":"4f2d7297d4d28dd848f59b341d688f7920adc8f241b7e1da6ad4079f86054ae7"} Jan 27 00:29:02 crc kubenswrapper[4774]: I0127 00:29:02.026000 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"e6b5f484-b9d4-4133-8d1f-af31e71ea317","Type":"ContainerStarted","Data":"6400856724c89fbcf80eb83da2af95703f59cc0a5ae533256599b067df3217f6"} Jan 27 00:29:03 crc kubenswrapper[4774]: I0127 00:29:03.037552 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"e6b5f484-b9d4-4133-8d1f-af31e71ea317","Type":"ContainerStarted","Data":"1acb6f758817187337affc4bdd3cea29ad0aa6e46b274461e5eea0064a19c955"} Jan 27 00:29:03 crc kubenswrapper[4774]: I0127 00:29:03.068495 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=3.068450684 podStartE2EDuration="3.068450684s" podCreationTimestamp="2026-01-27 00:29:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:29:03.062208786 +0000 UTC m=+1321.367985660" watchObservedRunningTime="2026-01-27 00:29:03.068450684 +0000 UTC m=+1321.374227578" Jan 27 00:29:11 crc kubenswrapper[4774]: I0127 00:29:11.724354 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Jan 27 00:29:11 crc kubenswrapper[4774]: I0127 00:29:11.726404 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" podUID="e6b5f484-b9d4-4133-8d1f-af31e71ea317" containerName="docker-build" containerID="cri-o://1acb6f758817187337affc4bdd3cea29ad0aa6e46b274461e5eea0064a19c955" gracePeriod=30 Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.127561 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_e6b5f484-b9d4-4133-8d1f-af31e71ea317/docker-build/0.log" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.129110 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_e6b5f484-b9d4-4133-8d1f-af31e71ea317/docker-build/0.log" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.129143 4774 generic.go:334] "Generic (PLEG): container finished" podID="e6b5f484-b9d4-4133-8d1f-af31e71ea317" containerID="1acb6f758817187337affc4bdd3cea29ad0aa6e46b274461e5eea0064a19c955" exitCode=1 Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.129186 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" 
event={"ID":"e6b5f484-b9d4-4133-8d1f-af31e71ea317","Type":"ContainerDied","Data":"1acb6f758817187337affc4bdd3cea29ad0aa6e46b274461e5eea0064a19c955"} Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.129234 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"e6b5f484-b9d4-4133-8d1f-af31e71ea317","Type":"ContainerDied","Data":"6400856724c89fbcf80eb83da2af95703f59cc0a5ae533256599b067df3217f6"} Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.129249 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6400856724c89fbcf80eb83da2af95703f59cc0a5ae533256599b067df3217f6" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.130479 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.269628 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-system-configs\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.269741 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-pull\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.269796 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-push\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.269854 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildworkdir\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.269914 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-ca-bundles\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.269963 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-run\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.270016 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ms8w\" (UniqueName: \"kubernetes.io/projected/e6b5f484-b9d4-4133-8d1f-af31e71ea317-kube-api-access-4ms8w\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.270039 4774 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-root\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.270152 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-proxy-ca-bundles\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.270181 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-blob-cache\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.270203 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-node-pullsecrets\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.270225 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildcachedir\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.270530 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.270617 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.271170 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.272046 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.272345 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.272353 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.273480 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.282542 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.283196 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.285294 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b5f484-b9d4-4133-8d1f-af31e71ea317-kube-api-access-4ms8w" (OuterVolumeSpecName: "kube-api-access-4ms8w") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "kube-api-access-4ms8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.351770 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371499 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371543 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371563 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371580 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371600 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371617 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371636 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371654 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371672 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371690 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371706 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ms8w\" (UniqueName: \"kubernetes.io/projected/e6b5f484-b9d4-4133-8d1f-af31e71ea317-kube-api-access-4ms8w\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.686929 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.776495 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.140625 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.209202 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.216479 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.298479 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Jan 27 00:29:13 crc kubenswrapper[4774]: E0127 00:29:13.298901 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b5f484-b9d4-4133-8d1f-af31e71ea317" containerName="manage-dockerfile" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.298929 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b5f484-b9d4-4133-8d1f-af31e71ea317" containerName="manage-dockerfile" Jan 27 00:29:13 crc kubenswrapper[4774]: E0127 00:29:13.298949 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b5f484-b9d4-4133-8d1f-af31e71ea317" containerName="docker-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.298960 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b5f484-b9d4-4133-8d1f-af31e71ea317" containerName="docker-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.299184 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b5f484-b9d4-4133-8d1f-af31e71ea317" containerName="docker-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.300412 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.303124 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-sys-config" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.303444 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.303671 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-global-ca" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.306458 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-ca" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.326166 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.487989 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488043 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488074 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488093 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm86t\" (UniqueName: \"kubernetes.io/projected/50bd722f-c392-4f78-99af-2009d9b35de2-kube-api-access-jm86t\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488197 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488218 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " 
pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488235 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488269 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488289 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488346 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488372 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488392 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590156 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590226 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 
00:29:13.590253 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm86t\" (UniqueName: \"kubernetes.io/projected/50bd722f-c392-4f78-99af-2009d9b35de2-kube-api-access-jm86t\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590347 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590372 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590391 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590411 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590438 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590486 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590517 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590543 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-proxy-ca-bundles\") pod 
\"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590566 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.591296 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.591401 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.591500 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.591511 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.591538 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.591721 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.591839 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.591905 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.592600 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.597716 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.597715 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.622261 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm86t\" (UniqueName: \"kubernetes.io/projected/50bd722f-c392-4f78-99af-2009d9b35de2-kube-api-access-jm86t\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.921531 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:14 crc kubenswrapper[4774]: I0127 00:29:14.185660 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Jan 27 00:29:14 crc kubenswrapper[4774]: I0127 00:29:14.374560 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b5f484-b9d4-4133-8d1f-af31e71ea317" path="/var/lib/kubelet/pods/e6b5f484-b9d4-4133-8d1f-af31e71ea317/volumes" Jan 27 00:29:15 crc kubenswrapper[4774]: I0127 00:29:15.157761 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"50bd722f-c392-4f78-99af-2009d9b35de2","Type":"ContainerStarted","Data":"49dbef974c4cda05528d3933f9e63b2acb525fc6f9b443e92b13b25b2daad9a3"} Jan 27 00:29:15 crc kubenswrapper[4774]: I0127 00:29:15.158301 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"50bd722f-c392-4f78-99af-2009d9b35de2","Type":"ContainerStarted","Data":"8cd691b05592e7b3c264b7fbed06d48d81cd99fb31ca0c955cfed3fd066caee3"} Jan 27 00:29:16 crc kubenswrapper[4774]: I0127 00:29:16.165803 4774 generic.go:334] "Generic (PLEG): container finished" podID="50bd722f-c392-4f78-99af-2009d9b35de2" containerID="49dbef974c4cda05528d3933f9e63b2acb525fc6f9b443e92b13b25b2daad9a3" exitCode=0 Jan 27 00:29:16 crc kubenswrapper[4774]: I0127 00:29:16.165878 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"50bd722f-c392-4f78-99af-2009d9b35de2","Type":"ContainerDied","Data":"49dbef974c4cda05528d3933f9e63b2acb525fc6f9b443e92b13b25b2daad9a3"} Jan 27 00:29:17 crc kubenswrapper[4774]: I0127 00:29:17.177089 4774 generic.go:334] "Generic (PLEG): container finished" podID="50bd722f-c392-4f78-99af-2009d9b35de2" containerID="70ff28ac8a42b280187899cd540772bb0368aa9b3ed60593bb29316ad8ccb06a" exitCode=0 Jan 27 00:29:17 crc kubenswrapper[4774]: I0127 00:29:17.177216 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"50bd722f-c392-4f78-99af-2009d9b35de2","Type":"ContainerDied","Data":"70ff28ac8a42b280187899cd540772bb0368aa9b3ed60593bb29316ad8ccb06a"} Jan 27 00:29:17 crc kubenswrapper[4774]: I0127 00:29:17.230677 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_50bd722f-c392-4f78-99af-2009d9b35de2/manage-dockerfile/0.log" Jan 27 00:29:18 crc kubenswrapper[4774]: I0127 00:29:18.194384 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"50bd722f-c392-4f78-99af-2009d9b35de2","Type":"ContainerStarted","Data":"6a4457f6719b8c78a9bdd9efd2a65239dabc2a80fae44a0533533901adf4c519"} Jan 27 00:29:18 crc kubenswrapper[4774]: I0127 00:29:18.241415 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" podStartSLOduration=5.241201682 podStartE2EDuration="5.241201682s" podCreationTimestamp="2026-01-27 00:29:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:29:18.235738025 +0000 UTC m=+1336.541514919" watchObservedRunningTime="2026-01-27 00:29:18.241201682 +0000 UTC m=+1336.546978566" Jan 27 00:29:36 crc kubenswrapper[4774]: I0127 00:29:36.675087 4774 patch_prober.go:28] 
interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:29:36 crc kubenswrapper[4774]: I0127 00:29:36.675766 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:29:51 crc kubenswrapper[4774]: I0127 00:29:51.736764 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-672r9"] Jan 27 00:29:51 crc kubenswrapper[4774]: I0127 00:29:51.739583 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:51 crc kubenswrapper[4774]: I0127 00:29:51.753690 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-672r9"] Jan 27 00:29:51 crc kubenswrapper[4774]: I0127 00:29:51.902938 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbkqd\" (UniqueName: \"kubernetes.io/projected/9e60fce2-faf3-4f7c-a798-668f9658a557-kube-api-access-bbkqd\") pod \"redhat-operators-672r9\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:51 crc kubenswrapper[4774]: I0127 00:29:51.903025 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-catalog-content\") pod \"redhat-operators-672r9\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:51 crc kubenswrapper[4774]: I0127 00:29:51.903792 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-utilities\") pod \"redhat-operators-672r9\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:52 crc kubenswrapper[4774]: I0127 00:29:52.004731 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbkqd\" (UniqueName: \"kubernetes.io/projected/9e60fce2-faf3-4f7c-a798-668f9658a557-kube-api-access-bbkqd\") pod \"redhat-operators-672r9\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:52 crc kubenswrapper[4774]: I0127 00:29:52.004784 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-catalog-content\") pod \"redhat-operators-672r9\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:52 crc kubenswrapper[4774]: I0127 00:29:52.004827 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-utilities\") pod \"redhat-operators-672r9\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " 
pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:52 crc kubenswrapper[4774]: I0127 00:29:52.005253 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-utilities\") pod \"redhat-operators-672r9\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:52 crc kubenswrapper[4774]: I0127 00:29:52.005415 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-catalog-content\") pod \"redhat-operators-672r9\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:52 crc kubenswrapper[4774]: I0127 00:29:52.031323 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbkqd\" (UniqueName: \"kubernetes.io/projected/9e60fce2-faf3-4f7c-a798-668f9658a557-kube-api-access-bbkqd\") pod \"redhat-operators-672r9\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:52 crc kubenswrapper[4774]: I0127 00:29:52.057807 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:52 crc kubenswrapper[4774]: I0127 00:29:52.307840 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-672r9"] Jan 27 00:29:52 crc kubenswrapper[4774]: I0127 00:29:52.481605 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-672r9" event={"ID":"9e60fce2-faf3-4f7c-a798-668f9658a557","Type":"ContainerStarted","Data":"bc478ee9fcc61b8b2a27262557b35effaf576b0e163905e01aa0bfa4d5e05938"} Jan 27 00:29:53 crc kubenswrapper[4774]: I0127 00:29:53.488853 4774 generic.go:334] "Generic (PLEG): container finished" podID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerID="39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e" exitCode=0 Jan 27 00:29:53 crc kubenswrapper[4774]: I0127 00:29:53.488971 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-672r9" event={"ID":"9e60fce2-faf3-4f7c-a798-668f9658a557","Type":"ContainerDied","Data":"39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e"} Jan 27 00:29:53 crc kubenswrapper[4774]: I0127 00:29:53.491064 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 00:29:55 crc kubenswrapper[4774]: I0127 00:29:55.505259 4774 generic.go:334] "Generic (PLEG): container finished" podID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerID="8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090" exitCode=0 Jan 27 00:29:55 crc kubenswrapper[4774]: I0127 00:29:55.505391 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-672r9" event={"ID":"9e60fce2-faf3-4f7c-a798-668f9658a557","Type":"ContainerDied","Data":"8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090"} Jan 27 00:29:56 crc kubenswrapper[4774]: I0127 00:29:56.513253 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-672r9" event={"ID":"9e60fce2-faf3-4f7c-a798-668f9658a557","Type":"ContainerStarted","Data":"e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0"} Jan 27 00:29:56 crc kubenswrapper[4774]: 
I0127 00:29:56.534682 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-672r9" podStartSLOduration=3.092441609 podStartE2EDuration="5.53466119s" podCreationTimestamp="2026-01-27 00:29:51 +0000 UTC" firstStartedPulling="2026-01-27 00:29:53.490782947 +0000 UTC m=+1371.796559831" lastFinishedPulling="2026-01-27 00:29:55.933002528 +0000 UTC m=+1374.238779412" observedRunningTime="2026-01-27 00:29:56.531952627 +0000 UTC m=+1374.837729541" watchObservedRunningTime="2026-01-27 00:29:56.53466119 +0000 UTC m=+1374.840438094" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.168435 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v"] Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.169835 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.176092 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.176713 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.201889 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v"] Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.321642 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p44fb\" (UniqueName: \"kubernetes.io/projected/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-kube-api-access-p44fb\") pod \"collect-profiles-29491230-tjg9v\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.321733 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-config-volume\") pod \"collect-profiles-29491230-tjg9v\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.321971 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-secret-volume\") pod \"collect-profiles-29491230-tjg9v\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.423337 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p44fb\" (UniqueName: \"kubernetes.io/projected/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-kube-api-access-p44fb\") pod \"collect-profiles-29491230-tjg9v\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.423413 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-config-volume\") pod \"collect-profiles-29491230-tjg9v\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.423947 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-secret-volume\") pod \"collect-profiles-29491230-tjg9v\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.424617 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-config-volume\") pod \"collect-profiles-29491230-tjg9v\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.431547 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-secret-volume\") pod \"collect-profiles-29491230-tjg9v\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.451409 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p44fb\" (UniqueName: \"kubernetes.io/projected/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-kube-api-access-p44fb\") pod \"collect-profiles-29491230-tjg9v\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.511208 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.725431 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v"] Jan 27 00:30:00 crc kubenswrapper[4774]: W0127 00:30:00.729149 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfd04202_19c2_4d82_9a03_ddd3ddbc35d0.slice/crio-14ea47f815b25e1376017627be8dda2b43a8f0a4d624ad7bc7580eb453d067b3 WatchSource:0}: Error finding container 14ea47f815b25e1376017627be8dda2b43a8f0a4d624ad7bc7580eb453d067b3: Status 404 returned error can't find the container with id 14ea47f815b25e1376017627be8dda2b43a8f0a4d624ad7bc7580eb453d067b3 Jan 27 00:30:01 crc kubenswrapper[4774]: I0127 00:30:01.555985 4774 generic.go:334] "Generic (PLEG): container finished" podID="bfd04202-19c2-4d82-9a03-ddd3ddbc35d0" containerID="950a86b020502de0870835a037b1b740c89163f25bd9f03d78753bef170c3ab0" exitCode=0 Jan 27 00:30:01 crc kubenswrapper[4774]: I0127 00:30:01.556404 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" event={"ID":"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0","Type":"ContainerDied","Data":"950a86b020502de0870835a037b1b740c89163f25bd9f03d78753bef170c3ab0"} Jan 27 00:30:01 crc kubenswrapper[4774]: I0127 00:30:01.556438 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" event={"ID":"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0","Type":"ContainerStarted","Data":"14ea47f815b25e1376017627be8dda2b43a8f0a4d624ad7bc7580eb453d067b3"} Jan 27 00:30:01 crc kubenswrapper[4774]: E0127 00:30:01.580636 4774 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfd04202_19c2_4d82_9a03_ddd3ddbc35d0.slice/crio-conmon-950a86b020502de0870835a037b1b740c89163f25bd9f03d78753bef170c3ab0.scope\": RecentStats: unable to find data in memory cache]" Jan 27 00:30:02 crc kubenswrapper[4774]: I0127 00:30:02.058823 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:30:02 crc kubenswrapper[4774]: I0127 00:30:02.058926 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:30:02 crc kubenswrapper[4774]: I0127 00:30:02.827461 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:02 crc kubenswrapper[4774]: I0127 00:30:02.964158 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-config-volume\") pod \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " Jan 27 00:30:02 crc kubenswrapper[4774]: I0127 00:30:02.964328 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p44fb\" (UniqueName: \"kubernetes.io/projected/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-kube-api-access-p44fb\") pod \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " Jan 27 00:30:02 crc kubenswrapper[4774]: I0127 00:30:02.964393 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-secret-volume\") pod \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " Jan 27 00:30:02 crc kubenswrapper[4774]: I0127 00:30:02.965382 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-config-volume" (OuterVolumeSpecName: "config-volume") pod "bfd04202-19c2-4d82-9a03-ddd3ddbc35d0" (UID: "bfd04202-19c2-4d82-9a03-ddd3ddbc35d0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:02 crc kubenswrapper[4774]: I0127 00:30:02.977696 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bfd04202-19c2-4d82-9a03-ddd3ddbc35d0" (UID: "bfd04202-19c2-4d82-9a03-ddd3ddbc35d0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:30:02 crc kubenswrapper[4774]: I0127 00:30:02.978599 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-kube-api-access-p44fb" (OuterVolumeSpecName: "kube-api-access-p44fb") pod "bfd04202-19c2-4d82-9a03-ddd3ddbc35d0" (UID: "bfd04202-19c2-4d82-9a03-ddd3ddbc35d0"). InnerVolumeSpecName "kube-api-access-p44fb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:30:03 crc kubenswrapper[4774]: I0127 00:30:03.066973 4774 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:03 crc kubenswrapper[4774]: I0127 00:30:03.067033 4774 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:03 crc kubenswrapper[4774]: I0127 00:30:03.067048 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p44fb\" (UniqueName: \"kubernetes.io/projected/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-kube-api-access-p44fb\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:03 crc kubenswrapper[4774]: I0127 00:30:03.109233 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-672r9" podUID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerName="registry-server" probeResult="failure" output=< Jan 27 00:30:03 crc kubenswrapper[4774]: timeout: failed to connect service ":50051" within 1s Jan 27 00:30:03 crc kubenswrapper[4774]: > Jan 27 00:30:03 crc kubenswrapper[4774]: I0127 00:30:03.582505 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" event={"ID":"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0","Type":"ContainerDied","Data":"14ea47f815b25e1376017627be8dda2b43a8f0a4d624ad7bc7580eb453d067b3"} Jan 27 00:30:03 crc kubenswrapper[4774]: I0127 00:30:03.583004 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14ea47f815b25e1376017627be8dda2b43a8f0a4d624ad7bc7580eb453d067b3" Jan 27 00:30:03 crc kubenswrapper[4774]: I0127 00:30:03.582561 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:06 crc kubenswrapper[4774]: I0127 00:30:06.675883 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:30:06 crc kubenswrapper[4774]: I0127 00:30:06.676377 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:30:08 crc kubenswrapper[4774]: I0127 00:30:08.631620 4774 generic.go:334] "Generic (PLEG): container finished" podID="50bd722f-c392-4f78-99af-2009d9b35de2" containerID="6a4457f6719b8c78a9bdd9efd2a65239dabc2a80fae44a0533533901adf4c519" exitCode=0 Jan 27 00:30:08 crc kubenswrapper[4774]: I0127 00:30:08.631689 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"50bd722f-c392-4f78-99af-2009d9b35de2","Type":"ContainerDied","Data":"6a4457f6719b8c78a9bdd9efd2a65239dabc2a80fae44a0533533901adf4c519"} Jan 27 00:30:09 crc kubenswrapper[4774]: I0127 00:30:09.958733 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.077699 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-system-configs\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.077771 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-buildworkdir\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.077810 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-pull\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.077852 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-build-blob-cache\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.077897 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-node-pullsecrets\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.077918 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-push\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.077988 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-run\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.078025 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-buildcachedir\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.078061 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-ca-bundles\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.078093 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm86t\" (UniqueName: 
\"kubernetes.io/projected/50bd722f-c392-4f78-99af-2009d9b35de2-kube-api-access-jm86t\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.078113 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-proxy-ca-bundles\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.078142 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-root\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.078196 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.078700 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.078709 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.079203 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.080013 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.080060 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). 
InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.080143 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.086363 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.086392 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.086406 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.086417 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.086429 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.086442 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.086453 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.087103 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50bd722f-c392-4f78-99af-2009d9b35de2-kube-api-access-jm86t" (OuterVolumeSpecName: "kube-api-access-jm86t") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "kube-api-access-jm86t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.087286 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.088428 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.187948 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.187982 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.187993 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jm86t\" (UniqueName: \"kubernetes.io/projected/50bd722f-c392-4f78-99af-2009d9b35de2-kube-api-access-jm86t\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.660492 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"50bd722f-c392-4f78-99af-2009d9b35de2","Type":"ContainerDied","Data":"8cd691b05592e7b3c264b7fbed06d48d81cd99fb31ca0c955cfed3fd066caee3"} Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.660925 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cd691b05592e7b3c264b7fbed06d48d81cd99fb31ca0c955cfed3fd066caee3" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.660615 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:30:11 crc kubenswrapper[4774]: I0127 00:30:11.541897 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:11 crc kubenswrapper[4774]: I0127 00:30:11.610785 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:11 crc kubenswrapper[4774]: I0127 00:30:11.876529 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:11 crc kubenswrapper[4774]: I0127 00:30:11.917913 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:12 crc kubenswrapper[4774]: I0127 00:30:12.100096 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:30:12 crc kubenswrapper[4774]: I0127 00:30:12.150905 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:30:12 crc kubenswrapper[4774]: I0127 00:30:12.341368 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-672r9"] Jan 27 00:30:13 crc kubenswrapper[4774]: I0127 00:30:13.699125 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-672r9" podUID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerName="registry-server" containerID="cri-o://e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0" gracePeriod=2 Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.178020 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.358150 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbkqd\" (UniqueName: \"kubernetes.io/projected/9e60fce2-faf3-4f7c-a798-668f9658a557-kube-api-access-bbkqd\") pod \"9e60fce2-faf3-4f7c-a798-668f9658a557\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.358625 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-utilities\") pod \"9e60fce2-faf3-4f7c-a798-668f9658a557\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.358649 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-catalog-content\") pod \"9e60fce2-faf3-4f7c-a798-668f9658a557\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.359491 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-utilities" (OuterVolumeSpecName: "utilities") pod "9e60fce2-faf3-4f7c-a798-668f9658a557" (UID: "9e60fce2-faf3-4f7c-a798-668f9658a557"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.362641 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.366056 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e60fce2-faf3-4f7c-a798-668f9658a557-kube-api-access-bbkqd" (OuterVolumeSpecName: "kube-api-access-bbkqd") pod "9e60fce2-faf3-4f7c-a798-668f9658a557" (UID: "9e60fce2-faf3-4f7c-a798-668f9658a557"). InnerVolumeSpecName "kube-api-access-bbkqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.464196 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbkqd\" (UniqueName: \"kubernetes.io/projected/9e60fce2-faf3-4f7c-a798-668f9658a557-kube-api-access-bbkqd\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.494724 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e60fce2-faf3-4f7c-a798-668f9658a557" (UID: "9e60fce2-faf3-4f7c-a798-668f9658a557"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.565414 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.708128 4774 generic.go:334] "Generic (PLEG): container finished" podID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerID="e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0" exitCode=0 Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.708191 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.708189 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-672r9" event={"ID":"9e60fce2-faf3-4f7c-a798-668f9658a557","Type":"ContainerDied","Data":"e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0"} Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.708284 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-672r9" event={"ID":"9e60fce2-faf3-4f7c-a798-668f9658a557","Type":"ContainerDied","Data":"bc478ee9fcc61b8b2a27262557b35effaf576b0e163905e01aa0bfa4d5e05938"} Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.708321 4774 scope.go:117] "RemoveContainer" containerID="e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.728175 4774 scope.go:117] "RemoveContainer" containerID="8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.745574 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-672r9"] Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.748972 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-672r9"] Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.763144 4774 scope.go:117] "RemoveContainer" containerID="39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.782961 4774 scope.go:117] "RemoveContainer" containerID="e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0" Jan 27 00:30:14 crc kubenswrapper[4774]: E0127 00:30:14.783781 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0\": container with ID starting with e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0 not found: ID does not exist" containerID="e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.783831 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0"} err="failed to get container status \"e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0\": rpc error: code = NotFound desc = could not find container \"e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0\": container with ID starting with e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0 not found: ID does not exist" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.783890 4774 scope.go:117] "RemoveContainer" containerID="8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090" Jan 27 00:30:14 crc kubenswrapper[4774]: E0127 00:30:14.784560 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090\": container with ID starting with 8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090 not found: ID does not exist" containerID="8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.784620 4774 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090"} err="failed to get container status \"8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090\": rpc error: code = NotFound desc = could not find container \"8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090\": container with ID starting with 8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090 not found: ID does not exist" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.784659 4774 scope.go:117] "RemoveContainer" containerID="39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e" Jan 27 00:30:14 crc kubenswrapper[4774]: E0127 00:30:14.785247 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e\": container with ID starting with 39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e not found: ID does not exist" containerID="39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.785268 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e"} err="failed to get container status \"39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e\": rpc error: code = NotFound desc = could not find container \"39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e\": container with ID starting with 39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e not found: ID does not exist" Jan 27 00:30:16 crc kubenswrapper[4774]: I0127 00:30:16.366507 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e60fce2-faf3-4f7c-a798-668f9658a557" path="/var/lib/kubelet/pods/9e60fce2-faf3-4f7c-a798-668f9658a557/volumes" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.152439 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Jan 27 00:30:20 crc kubenswrapper[4774]: E0127 00:30:20.154338 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerName="extract-content" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.154365 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerName="extract-content" Jan 27 00:30:20 crc kubenswrapper[4774]: E0127 00:30:20.154381 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfd04202-19c2-4d82-9a03-ddd3ddbc35d0" containerName="collect-profiles" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.154389 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd04202-19c2-4d82-9a03-ddd3ddbc35d0" containerName="collect-profiles" Jan 27 00:30:20 crc kubenswrapper[4774]: E0127 00:30:20.154401 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50bd722f-c392-4f78-99af-2009d9b35de2" containerName="git-clone" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.154410 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="50bd722f-c392-4f78-99af-2009d9b35de2" containerName="git-clone" Jan 27 00:30:20 crc kubenswrapper[4774]: E0127 00:30:20.154421 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e60fce2-faf3-4f7c-a798-668f9658a557" 
containerName="extract-utilities" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.154429 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerName="extract-utilities" Jan 27 00:30:20 crc kubenswrapper[4774]: E0127 00:30:20.154440 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerName="registry-server" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.154448 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerName="registry-server" Jan 27 00:30:20 crc kubenswrapper[4774]: E0127 00:30:20.154464 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50bd722f-c392-4f78-99af-2009d9b35de2" containerName="manage-dockerfile" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.154472 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="50bd722f-c392-4f78-99af-2009d9b35de2" containerName="manage-dockerfile" Jan 27 00:30:20 crc kubenswrapper[4774]: E0127 00:30:20.154490 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50bd722f-c392-4f78-99af-2009d9b35de2" containerName="docker-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.154497 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="50bd722f-c392-4f78-99af-2009d9b35de2" containerName="docker-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.154639 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerName="registry-server" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.154652 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfd04202-19c2-4d82-9a03-ddd3ddbc35d0" containerName="collect-profiles" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.154670 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="50bd722f-c392-4f78-99af-2009d9b35de2" containerName="docker-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.155565 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.157797 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-global-ca" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.158120 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.158714 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-sys-config" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.162013 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-ca" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.172118 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345331 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345386 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345429 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fptm8\" (UniqueName: \"kubernetes.io/projected/69f0575b-205f-45e8-9dc2-6dc9a58677b3-kube-api-access-fptm8\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345461 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345569 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345614 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345641 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345689 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345732 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345779 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345804 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345827 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.447778 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.447878 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.447929 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.447973 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.448007 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.448067 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fptm8\" (UniqueName: \"kubernetes.io/projected/69f0575b-205f-45e8-9dc2-6dc9a58677b3-kube-api-access-fptm8\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.448128 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.448200 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.448234 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.448267 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.448308 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.448366 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.448553 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.449164 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.449317 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.449366 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.449832 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.450198 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-node-pullsecrets\") pod 
\"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.450669 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.450754 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.450790 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.456032 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.456706 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.469635 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fptm8\" (UniqueName: \"kubernetes.io/projected/69f0575b-205f-45e8-9dc2-6dc9a58677b3-kube-api-access-fptm8\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.515039 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.949496 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.995926 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4hn7v"] Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.997202 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.014744 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4hn7v"] Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.056730 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-catalog-content\") pod \"community-operators-4hn7v\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.057024 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-utilities\") pod \"community-operators-4hn7v\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.057367 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tvvp\" (UniqueName: \"kubernetes.io/projected/01f2f66a-e5f3-4fa4-922f-048d076eb189-kube-api-access-6tvvp\") pod \"community-operators-4hn7v\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.158546 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tvvp\" (UniqueName: \"kubernetes.io/projected/01f2f66a-e5f3-4fa4-922f-048d076eb189-kube-api-access-6tvvp\") pod \"community-operators-4hn7v\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.158624 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-catalog-content\") pod \"community-operators-4hn7v\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.158662 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-utilities\") pod \"community-operators-4hn7v\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.159181 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-utilities\") pod \"community-operators-4hn7v\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.159276 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-catalog-content\") pod \"community-operators-4hn7v\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.185019 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6tvvp\" (UniqueName: \"kubernetes.io/projected/01f2f66a-e5f3-4fa4-922f-048d076eb189-kube-api-access-6tvvp\") pod \"community-operators-4hn7v\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.331409 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.643731 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4hn7v"] Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.770004 4774 generic.go:334] "Generic (PLEG): container finished" podID="69f0575b-205f-45e8-9dc2-6dc9a58677b3" containerID="7bdfb69e9fa9696e37a7d9b25d175b63e5343122a1e569faca1c08955f8bbb1d" exitCode=0 Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.770105 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"69f0575b-205f-45e8-9dc2-6dc9a58677b3","Type":"ContainerDied","Data":"7bdfb69e9fa9696e37a7d9b25d175b63e5343122a1e569faca1c08955f8bbb1d"} Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.770166 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"69f0575b-205f-45e8-9dc2-6dc9a58677b3","Type":"ContainerStarted","Data":"d81d42440ab00778c86f6fcb4c62ec165af109b644786779b55c38f5d05a2e58"} Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.773061 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hn7v" event={"ID":"01f2f66a-e5f3-4fa4-922f-048d076eb189","Type":"ContainerStarted","Data":"1d14e5556a15340dd3b0becb619b1db4a8136052b768e1c956c0feb6736d4ddb"} Jan 27 00:30:22 crc kubenswrapper[4774]: E0127 00:30:22.045831 4774 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01f2f66a_e5f3_4fa4_922f_048d076eb189.slice/crio-3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023.scope\": RecentStats: unable to find data in memory cache]" Jan 27 00:30:22 crc kubenswrapper[4774]: I0127 00:30:22.784237 4774 generic.go:334] "Generic (PLEG): container finished" podID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerID="3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023" exitCode=0 Jan 27 00:30:22 crc kubenswrapper[4774]: I0127 00:30:22.784369 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hn7v" event={"ID":"01f2f66a-e5f3-4fa4-922f-048d076eb189","Type":"ContainerDied","Data":"3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023"} Jan 27 00:30:22 crc kubenswrapper[4774]: I0127 00:30:22.787304 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_69f0575b-205f-45e8-9dc2-6dc9a58677b3/docker-build/0.log" Jan 27 00:30:22 crc kubenswrapper[4774]: I0127 00:30:22.787781 4774 generic.go:334] "Generic (PLEG): container finished" podID="69f0575b-205f-45e8-9dc2-6dc9a58677b3" containerID="89911886b25b8c950e2ce162b903638384fd14c3a7c90ec3cfa3cf3f5c1e0950" exitCode=1 Jan 27 00:30:22 crc kubenswrapper[4774]: I0127 00:30:22.787832 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"69f0575b-205f-45e8-9dc2-6dc9a58677b3","Type":"ContainerDied","Data":"89911886b25b8c950e2ce162b903638384fd14c3a7c90ec3cfa3cf3f5c1e0950"} Jan 27 00:30:23 crc kubenswrapper[4774]: I0127 00:30:23.797934 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hn7v" event={"ID":"01f2f66a-e5f3-4fa4-922f-048d076eb189","Type":"ContainerStarted","Data":"88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0"} Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.158303 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_69f0575b-205f-45e8-9dc2-6dc9a58677b3/docker-build/0.log" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.159060 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.215618 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-root\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.215946 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-blob-cache\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216058 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-ca-bundles\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216139 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-pull\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216214 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-run\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216285 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216451 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildcachedir\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216541 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-proxy-ca-bundles\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216642 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildworkdir\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216543 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216767 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-system-configs\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216687 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216889 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-push\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216930 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fptm8\" (UniqueName: \"kubernetes.io/projected/69f0575b-205f-45e8-9dc2-6dc9a58677b3-kube-api-access-fptm8\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216955 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-node-pullsecrets\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217092 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217099 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217160 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217473 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217619 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217682 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217737 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217799 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217872 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217942 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.218004 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217727 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.218470 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.221099 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.221452 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.221550 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69f0575b-205f-45e8-9dc2-6dc9a58677b3-kube-api-access-fptm8" (OuterVolumeSpecName: "kube-api-access-fptm8") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "kube-api-access-fptm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.320706 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.320774 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fptm8\" (UniqueName: \"kubernetes.io/projected/69f0575b-205f-45e8-9dc2-6dc9a58677b3-kube-api-access-fptm8\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.320795 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.320814 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.320834 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.805157 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_69f0575b-205f-45e8-9dc2-6dc9a58677b3/docker-build/0.log" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.805561 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.805517 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"69f0575b-205f-45e8-9dc2-6dc9a58677b3","Type":"ContainerDied","Data":"d81d42440ab00778c86f6fcb4c62ec165af109b644786779b55c38f5d05a2e58"} Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.805894 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d81d42440ab00778c86f6fcb4c62ec165af109b644786779b55c38f5d05a2e58" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.808472 4774 generic.go:334] "Generic (PLEG): container finished" podID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerID="88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0" exitCode=0 Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.808504 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hn7v" event={"ID":"01f2f66a-e5f3-4fa4-922f-048d076eb189","Type":"ContainerDied","Data":"88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0"} Jan 27 00:30:25 crc kubenswrapper[4774]: I0127 00:30:25.817770 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hn7v" event={"ID":"01f2f66a-e5f3-4fa4-922f-048d076eb189","Type":"ContainerStarted","Data":"fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d"} Jan 27 00:30:25 crc kubenswrapper[4774]: I0127 00:30:25.839002 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4hn7v" podStartSLOduration=3.282519349 podStartE2EDuration="5.838974781s" podCreationTimestamp="2026-01-27 00:30:20 +0000 UTC" firstStartedPulling="2026-01-27 00:30:22.786445697 +0000 UTC m=+1401.092222601" lastFinishedPulling="2026-01-27 00:30:25.342901149 +0000 UTC m=+1403.648678033" observedRunningTime="2026-01-27 00:30:25.833105274 +0000 UTC m=+1404.138882188" watchObservedRunningTime="2026-01-27 00:30:25.838974781 +0000 UTC m=+1404.144751665" Jan 27 00:30:30 crc kubenswrapper[4774]: I0127 00:30:30.649943 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Jan 27 00:30:30 crc kubenswrapper[4774]: I0127 00:30:30.657536 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Jan 27 00:30:31 crc kubenswrapper[4774]: I0127 00:30:31.331905 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:31 crc kubenswrapper[4774]: I0127 00:30:31.331992 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:31 crc kubenswrapper[4774]: I0127 00:30:31.388791 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:31 crc kubenswrapper[4774]: I0127 00:30:31.914568 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:31 crc kubenswrapper[4774]: I0127 00:30:31.965015 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4hn7v"] Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.330003 4774 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Jan 27 00:30:32 crc kubenswrapper[4774]: E0127 00:30:32.330399 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69f0575b-205f-45e8-9dc2-6dc9a58677b3" containerName="manage-dockerfile" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.330430 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="69f0575b-205f-45e8-9dc2-6dc9a58677b3" containerName="manage-dockerfile" Jan 27 00:30:32 crc kubenswrapper[4774]: E0127 00:30:32.330478 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69f0575b-205f-45e8-9dc2-6dc9a58677b3" containerName="docker-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.330493 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="69f0575b-205f-45e8-9dc2-6dc9a58677b3" containerName="docker-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.330692 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="69f0575b-205f-45e8-9dc2-6dc9a58677b3" containerName="docker-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.335673 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.337838 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-global-ca" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338247 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338277 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-ca" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338296 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338396 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338389 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338492 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-system-configs\") pod 
\"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338545 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338638 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338758 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nltjr\" (UniqueName: \"kubernetes.io/projected/a9452159-5284-43d8-a321-ddb59019b55d-kube-api-access-nltjr\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338802 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338834 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338877 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338926 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.339019 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.340034 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-sys-config" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.356364 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.366497 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69f0575b-205f-45e8-9dc2-6dc9a58677b3" path="/var/lib/kubelet/pods/69f0575b-205f-45e8-9dc2-6dc9a58677b3/volumes" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.439724 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.439771 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.439793 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.439816 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.439850 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.439879 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.439898 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nltjr\" (UniqueName: \"kubernetes.io/projected/a9452159-5284-43d8-a321-ddb59019b55d-kube-api-access-nltjr\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.440154 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.439915 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.440285 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.440308 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.440416 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.440528 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.440596 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.441749 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" 
(UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.441879 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.441952 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.442045 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.442271 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.442289 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.442463 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.446662 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.446754 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-push\") pod 
\"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.458541 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nltjr\" (UniqueName: \"kubernetes.io/projected/a9452159-5284-43d8-a321-ddb59019b55d-kube-api-access-nltjr\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.655277 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.898162 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Jan 27 00:30:33 crc kubenswrapper[4774]: I0127 00:30:33.878077 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"a9452159-5284-43d8-a321-ddb59019b55d","Type":"ContainerStarted","Data":"805fe66c3ab78445d712fcea5389d82d0f6bf329d62105c8d671658dfa8da638"} Jan 27 00:30:33 crc kubenswrapper[4774]: I0127 00:30:33.878415 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"a9452159-5284-43d8-a321-ddb59019b55d","Type":"ContainerStarted","Data":"2d2f2297cd3f2b3bd4e7f55c45acb60684ffc1e32ed0278ab3f333601623b81f"} Jan 27 00:30:33 crc kubenswrapper[4774]: I0127 00:30:33.878269 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4hn7v" podUID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerName="registry-server" containerID="cri-o://fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d" gracePeriod=2 Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.249065 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.267184 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tvvp\" (UniqueName: \"kubernetes.io/projected/01f2f66a-e5f3-4fa4-922f-048d076eb189-kube-api-access-6tvvp\") pod \"01f2f66a-e5f3-4fa4-922f-048d076eb189\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.267289 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-utilities\") pod \"01f2f66a-e5f3-4fa4-922f-048d076eb189\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.267440 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-catalog-content\") pod \"01f2f66a-e5f3-4fa4-922f-048d076eb189\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.268058 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-utilities" (OuterVolumeSpecName: "utilities") pod "01f2f66a-e5f3-4fa4-922f-048d076eb189" (UID: "01f2f66a-e5f3-4fa4-922f-048d076eb189"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.274559 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01f2f66a-e5f3-4fa4-922f-048d076eb189-kube-api-access-6tvvp" (OuterVolumeSpecName: "kube-api-access-6tvvp") pod "01f2f66a-e5f3-4fa4-922f-048d076eb189" (UID: "01f2f66a-e5f3-4fa4-922f-048d076eb189"). InnerVolumeSpecName "kube-api-access-6tvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.292454 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tvvp\" (UniqueName: \"kubernetes.io/projected/01f2f66a-e5f3-4fa4-922f-048d076eb189-kube-api-access-6tvvp\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.292508 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.344293 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01f2f66a-e5f3-4fa4-922f-048d076eb189" (UID: "01f2f66a-e5f3-4fa4-922f-048d076eb189"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.394308 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.886406 4774 generic.go:334] "Generic (PLEG): container finished" podID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerID="fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d" exitCode=0 Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.886489 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hn7v" event={"ID":"01f2f66a-e5f3-4fa4-922f-048d076eb189","Type":"ContainerDied","Data":"fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d"} Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.886517 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.886542 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hn7v" event={"ID":"01f2f66a-e5f3-4fa4-922f-048d076eb189","Type":"ContainerDied","Data":"1d14e5556a15340dd3b0becb619b1db4a8136052b768e1c956c0feb6736d4ddb"} Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.886578 4774 scope.go:117] "RemoveContainer" containerID="fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.888114 4774 generic.go:334] "Generic (PLEG): container finished" podID="a9452159-5284-43d8-a321-ddb59019b55d" containerID="805fe66c3ab78445d712fcea5389d82d0f6bf329d62105c8d671658dfa8da638" exitCode=0 Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.888177 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"a9452159-5284-43d8-a321-ddb59019b55d","Type":"ContainerDied","Data":"805fe66c3ab78445d712fcea5389d82d0f6bf329d62105c8d671658dfa8da638"} Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.932456 4774 scope.go:117] "RemoveContainer" containerID="88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.978026 4774 scope.go:117] "RemoveContainer" containerID="3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.992362 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4hn7v"] Jan 27 00:30:35 crc kubenswrapper[4774]: I0127 00:30:35.000845 4774 scope.go:117] "RemoveContainer" containerID="fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d" Jan 27 00:30:35 crc kubenswrapper[4774]: E0127 00:30:35.001516 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d\": container with ID starting with fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d not found: ID does not exist" containerID="fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d" Jan 27 00:30:35 crc kubenswrapper[4774]: I0127 00:30:35.001681 4774 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d"} err="failed to get container status \"fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d\": rpc error: code = NotFound desc = could not find container \"fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d\": container with ID starting with fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d not found: ID does not exist" Jan 27 00:30:35 crc kubenswrapper[4774]: I0127 00:30:35.001769 4774 scope.go:117] "RemoveContainer" containerID="88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0" Jan 27 00:30:35 crc kubenswrapper[4774]: E0127 00:30:35.002364 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0\": container with ID starting with 88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0 not found: ID does not exist" containerID="88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0" Jan 27 00:30:35 crc kubenswrapper[4774]: I0127 00:30:35.002426 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0"} err="failed to get container status \"88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0\": rpc error: code = NotFound desc = could not find container \"88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0\": container with ID starting with 88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0 not found: ID does not exist" Jan 27 00:30:35 crc kubenswrapper[4774]: I0127 00:30:35.002449 4774 scope.go:117] "RemoveContainer" containerID="3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023" Jan 27 00:30:35 crc kubenswrapper[4774]: E0127 00:30:35.002936 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023\": container with ID starting with 3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023 not found: ID does not exist" containerID="3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023" Jan 27 00:30:35 crc kubenswrapper[4774]: I0127 00:30:35.003016 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023"} err="failed to get container status \"3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023\": rpc error: code = NotFound desc = could not find container \"3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023\": container with ID starting with 3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023 not found: ID does not exist" Jan 27 00:30:35 crc kubenswrapper[4774]: I0127 00:30:35.010890 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4hn7v"] Jan 27 00:30:35 crc kubenswrapper[4774]: I0127 00:30:35.898919 4774 generic.go:334] "Generic (PLEG): container finished" podID="a9452159-5284-43d8-a321-ddb59019b55d" containerID="7ac4abecb15fc0a8b695048482910c50ba464b35c70e3a489852335144587a88" exitCode=0 Jan 27 00:30:35 crc kubenswrapper[4774]: I0127 00:30:35.899112 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" 
event={"ID":"a9452159-5284-43d8-a321-ddb59019b55d","Type":"ContainerDied","Data":"7ac4abecb15fc0a8b695048482910c50ba464b35c70e3a489852335144587a88"} Jan 27 00:30:35 crc kubenswrapper[4774]: I0127 00:30:35.944554 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-2-build_a9452159-5284-43d8-a321-ddb59019b55d/manage-dockerfile/0.log" Jan 27 00:30:36 crc kubenswrapper[4774]: I0127 00:30:36.364500 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01f2f66a-e5f3-4fa4-922f-048d076eb189" path="/var/lib/kubelet/pods/01f2f66a-e5f3-4fa4-922f-048d076eb189/volumes" Jan 27 00:30:36 crc kubenswrapper[4774]: I0127 00:30:36.676059 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:30:36 crc kubenswrapper[4774]: I0127 00:30:36.676470 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:30:36 crc kubenswrapper[4774]: I0127 00:30:36.676521 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:30:36 crc kubenswrapper[4774]: I0127 00:30:36.677287 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c53cb6c38911d466299f8dab8954ff32a3cd2ab17025a91e5e6eb03240440a5"} pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 00:30:36 crc kubenswrapper[4774]: I0127 00:30:36.677350 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" containerID="cri-o://1c53cb6c38911d466299f8dab8954ff32a3cd2ab17025a91e5e6eb03240440a5" gracePeriod=600 Jan 27 00:30:36 crc kubenswrapper[4774]: I0127 00:30:36.913010 4774 generic.go:334] "Generic (PLEG): container finished" podID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerID="1c53cb6c38911d466299f8dab8954ff32a3cd2ab17025a91e5e6eb03240440a5" exitCode=0 Jan 27 00:30:36 crc kubenswrapper[4774]: I0127 00:30:36.913089 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerDied","Data":"1c53cb6c38911d466299f8dab8954ff32a3cd2ab17025a91e5e6eb03240440a5"} Jan 27 00:30:36 crc kubenswrapper[4774]: I0127 00:30:36.913187 4774 scope.go:117] "RemoveContainer" containerID="7685c19213d51fa9221db2865a3a407e305a658c67b584f4201953f8284c60cb" Jan 27 00:30:36 crc kubenswrapper[4774]: I0127 00:30:36.917413 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"a9452159-5284-43d8-a321-ddb59019b55d","Type":"ContainerStarted","Data":"151be50c8b2cb5c05bfac6a97d7a3b29597ca3689c36ab5dc3b127cd32f2dcfd"} Jan 27 00:30:36 crc 
kubenswrapper[4774]: I0127 00:30:36.972362 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-2-build" podStartSLOduration=4.972330468 podStartE2EDuration="4.972330468s" podCreationTimestamp="2026-01-27 00:30:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:30:36.964602061 +0000 UTC m=+1415.270378955" watchObservedRunningTime="2026-01-27 00:30:36.972330468 +0000 UTC m=+1415.278107362" Jan 27 00:30:37 crc kubenswrapper[4774]: I0127 00:30:37.928221 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541"} Jan 27 00:30:39 crc kubenswrapper[4774]: I0127 00:30:39.964132 4774 generic.go:334] "Generic (PLEG): container finished" podID="a9452159-5284-43d8-a321-ddb59019b55d" containerID="151be50c8b2cb5c05bfac6a97d7a3b29597ca3689c36ab5dc3b127cd32f2dcfd" exitCode=0 Jan 27 00:30:39 crc kubenswrapper[4774]: I0127 00:30:39.964229 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"a9452159-5284-43d8-a321-ddb59019b55d","Type":"ContainerDied","Data":"151be50c8b2cb5c05bfac6a97d7a3b29597ca3689c36ab5dc3b127cd32f2dcfd"} Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.218293 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266120 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-root\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266205 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-pull\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266264 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-buildworkdir\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266354 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-run\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266418 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-system-configs\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 
00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266464 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-ca-bundles\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266513 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-buildcachedir\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266552 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nltjr\" (UniqueName: \"kubernetes.io/projected/a9452159-5284-43d8-a321-ddb59019b55d-kube-api-access-nltjr\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266584 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-push\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266618 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-proxy-ca-bundles\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266649 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-build-blob-cache\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266694 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-node-pullsecrets\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.267221 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.267283 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.267440 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.267548 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.267752 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.268275 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.273517 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.274160 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.274340 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.274627 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.275177 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9452159-5284-43d8-a321-ddb59019b55d-kube-api-access-nltjr" (OuterVolumeSpecName: "kube-api-access-nltjr") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "kube-api-access-nltjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.276138 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.368535 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369094 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369166 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369225 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369283 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369335 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369393 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369450 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369502 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nltjr\" (UniqueName: \"kubernetes.io/projected/a9452159-5284-43d8-a321-ddb59019b55d-kube-api-access-nltjr\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369554 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369607 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369705 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.981927 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"a9452159-5284-43d8-a321-ddb59019b55d","Type":"ContainerDied","Data":"2d2f2297cd3f2b3bd4e7f55c45acb60684ffc1e32ed0278ab3f333601623b81f"} Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.981973 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d2f2297cd3f2b3bd4e7f55c45acb60684ffc1e32ed0278ab3f333601623b81f" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.982008 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.313467 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Jan 27 00:30:45 crc kubenswrapper[4774]: E0127 00:30:45.314081 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerName="registry-server" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.314097 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerName="registry-server" Jan 27 00:30:45 crc kubenswrapper[4774]: E0127 00:30:45.314107 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9452159-5284-43d8-a321-ddb59019b55d" containerName="docker-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.314114 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9452159-5284-43d8-a321-ddb59019b55d" containerName="docker-build" Jan 27 00:30:45 crc kubenswrapper[4774]: E0127 00:30:45.314127 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9452159-5284-43d8-a321-ddb59019b55d" containerName="manage-dockerfile" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.314135 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9452159-5284-43d8-a321-ddb59019b55d" containerName="manage-dockerfile" Jan 27 00:30:45 crc kubenswrapper[4774]: E0127 00:30:45.314147 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerName="extract-utilities" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.314155 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerName="extract-utilities" Jan 27 00:30:45 crc kubenswrapper[4774]: E0127 00:30:45.314178 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerName="extract-content" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.314186 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerName="extract-content" Jan 27 00:30:45 crc kubenswrapper[4774]: E0127 00:30:45.314196 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9452159-5284-43d8-a321-ddb59019b55d" containerName="git-clone" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.314206 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9452159-5284-43d8-a321-ddb59019b55d" containerName="git-clone" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.314328 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9452159-5284-43d8-a321-ddb59019b55d" containerName="docker-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.314349 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerName="registry-server" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.315187 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.320571 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-sys-config" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.320571 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-ca" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.320784 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.320937 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-global-ca" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.334876 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436179 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436310 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436341 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436369 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436387 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436403 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-system-configs\") pod 
\"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436427 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436449 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436468 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4mxt\" (UniqueName: \"kubernetes.io/projected/92a02e6d-fee7-4a15-b923-7da2262b3324-kube-api-access-x4mxt\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436486 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436504 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436535 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.537999 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.538110 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-push\") pod 
\"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.538162 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.538201 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.538237 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.538264 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.538288 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.538312 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.538338 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.538365 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4mxt\" (UniqueName: \"kubernetes.io/projected/92a02e6d-fee7-4a15-b923-7da2262b3324-kube-api-access-x4mxt\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 
crc kubenswrapper[4774]: I0127 00:30:45.538401 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.538433 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.539460 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.539680 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.539930 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.540105 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.540524 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.541150 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.541203 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.541374 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.541739 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.547703 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.547744 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.564205 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4mxt\" (UniqueName: \"kubernetes.io/projected/92a02e6d-fee7-4a15-b923-7da2262b3324-kube-api-access-x4mxt\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.633892 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:46 crc kubenswrapper[4774]: I0127 00:30:46.116338 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Jan 27 00:30:47 crc kubenswrapper[4774]: I0127 00:30:47.042312 4774 generic.go:334] "Generic (PLEG): container finished" podID="92a02e6d-fee7-4a15-b923-7da2262b3324" containerID="14336beb86026bc90d2a9fb6311a3227ae7133bc262a6b935592389475831309" exitCode=0 Jan 27 00:30:47 crc kubenswrapper[4774]: I0127 00:30:47.042432 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"92a02e6d-fee7-4a15-b923-7da2262b3324","Type":"ContainerDied","Data":"14336beb86026bc90d2a9fb6311a3227ae7133bc262a6b935592389475831309"} Jan 27 00:30:47 crc kubenswrapper[4774]: I0127 00:30:47.044392 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"92a02e6d-fee7-4a15-b923-7da2262b3324","Type":"ContainerStarted","Data":"1e33de5c76cbf6c9b219f3d4baa1dfaea96de503848239d1c8ca6f198d8bc8a5"} Jan 27 00:30:48 crc kubenswrapper[4774]: I0127 00:30:48.055104 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_92a02e6d-fee7-4a15-b923-7da2262b3324/docker-build/0.log" Jan 27 00:30:48 crc kubenswrapper[4774]: I0127 00:30:48.057396 4774 generic.go:334] "Generic (PLEG): container finished" podID="92a02e6d-fee7-4a15-b923-7da2262b3324" containerID="49b5f230df2b7f05eeb23873ef8a668173f6d60ab2c8dbad4bbe6685a4aa6248" exitCode=1 Jan 27 00:30:48 crc kubenswrapper[4774]: I0127 00:30:48.057451 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"92a02e6d-fee7-4a15-b923-7da2262b3324","Type":"ContainerDied","Data":"49b5f230df2b7f05eeb23873ef8a668173f6d60ab2c8dbad4bbe6685a4aa6248"} Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.334544 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_92a02e6d-fee7-4a15-b923-7da2262b3324/docker-build/0.log" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.335317 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499350 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4mxt\" (UniqueName: \"kubernetes.io/projected/92a02e6d-fee7-4a15-b923-7da2262b3324-kube-api-access-x4mxt\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499482 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-push\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499519 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-proxy-ca-bundles\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499586 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-buildworkdir\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499621 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-build-blob-cache\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499673 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-node-pullsecrets\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499742 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-run\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499788 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-system-configs\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499816 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-ca-bundles\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499843 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-buildcachedir\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499893 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-pull\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499931 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-root\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.501812 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.502027 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.502385 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.502601 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.502741 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.502771 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). 
InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.501805 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.503125 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.503332 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.504473 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.504557 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a02e6d-fee7-4a15-b923-7da2262b3324-kube-api-access-x4mxt" (OuterVolumeSpecName: "kube-api-access-x4mxt") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "kube-api-access-x4mxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.505179 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601443 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601507 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601526 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601549 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601566 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601578 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601590 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601603 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601614 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4mxt\" (UniqueName: \"kubernetes.io/projected/92a02e6d-fee7-4a15-b923-7da2262b3324-kube-api-access-x4mxt\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601627 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601639 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601651 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:50 crc kubenswrapper[4774]: I0127 00:30:50.071680 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_92a02e6d-fee7-4a15-b923-7da2262b3324/docker-build/0.log" Jan 27 00:30:50 crc kubenswrapper[4774]: I0127 00:30:50.072158 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"92a02e6d-fee7-4a15-b923-7da2262b3324","Type":"ContainerDied","Data":"1e33de5c76cbf6c9b219f3d4baa1dfaea96de503848239d1c8ca6f198d8bc8a5"} Jan 27 00:30:50 crc kubenswrapper[4774]: I0127 00:30:50.072199 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e33de5c76cbf6c9b219f3d4baa1dfaea96de503848239d1c8ca6f198d8bc8a5" Jan 27 00:30:50 crc kubenswrapper[4774]: I0127 00:30:50.072238 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:55 crc kubenswrapper[4774]: I0127 00:30:55.772157 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Jan 27 00:30:55 crc kubenswrapper[4774]: I0127 00:30:55.781825 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Jan 27 00:30:56 crc kubenswrapper[4774]: I0127 00:30:56.372590 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92a02e6d-fee7-4a15-b923-7da2262b3324" path="/var/lib/kubelet/pods/92a02e6d-fee7-4a15-b923-7da2262b3324/volumes" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.486977 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Jan 27 00:30:57 crc kubenswrapper[4774]: E0127 00:30:57.487364 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a02e6d-fee7-4a15-b923-7da2262b3324" containerName="manage-dockerfile" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.487386 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a02e6d-fee7-4a15-b923-7da2262b3324" containerName="manage-dockerfile" Jan 27 00:30:57 crc kubenswrapper[4774]: E0127 00:30:57.487405 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a02e6d-fee7-4a15-b923-7da2262b3324" containerName="docker-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.487416 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a02e6d-fee7-4a15-b923-7da2262b3324" containerName="docker-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.487581 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a02e6d-fee7-4a15-b923-7da2262b3324" containerName="docker-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.488960 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.492793 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-sys-config" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.492814 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.493384 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-ca" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.498693 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-global-ca" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545393 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545466 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545537 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545600 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545626 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4mrl\" (UniqueName: \"kubernetes.io/projected/fcff99b3-b2b8-4a05-b19c-1498f32af54f-kube-api-access-r4mrl\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545669 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc 
kubenswrapper[4774]: I0127 00:30:57.545697 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545737 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545826 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545928 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545953 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545976 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.548428 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.646920 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.646980 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: 
\"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647000 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647031 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647053 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647101 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647132 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647153 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4mrl\" (UniqueName: \"kubernetes.io/projected/fcff99b3-b2b8-4a05-b19c-1498f32af54f-kube-api-access-r4mrl\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647179 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647202 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: 
I0127 00:30:57.647224 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647264 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647448 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647539 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647594 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.648099 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.648207 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.649018 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.649132 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.649066 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.649317 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.656271 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.656285 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.678851 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4mrl\" (UniqueName: \"kubernetes.io/projected/fcff99b3-b2b8-4a05-b19c-1498f32af54f-kube-api-access-r4mrl\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.810971 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:58 crc kubenswrapper[4774]: I0127 00:30:58.258220 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Jan 27 00:30:58 crc kubenswrapper[4774]: W0127 00:30:58.268229 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcff99b3_b2b8_4a05_b19c_1498f32af54f.slice/crio-cc68bf99684cb5ff76d5de16ab39bad2d5f93a075e515ad4e59da8fe8111446b WatchSource:0}: Error finding container cc68bf99684cb5ff76d5de16ab39bad2d5f93a075e515ad4e59da8fe8111446b: Status 404 returned error can't find the container with id cc68bf99684cb5ff76d5de16ab39bad2d5f93a075e515ad4e59da8fe8111446b Jan 27 00:30:59 crc kubenswrapper[4774]: I0127 00:30:59.140746 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"fcff99b3-b2b8-4a05-b19c-1498f32af54f","Type":"ContainerStarted","Data":"2b5d96bc668cc1b8799c7e981e3b41a835d8526d3b481d954264c13d6e5699d3"} Jan 27 00:30:59 crc kubenswrapper[4774]: I0127 00:30:59.141089 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"fcff99b3-b2b8-4a05-b19c-1498f32af54f","Type":"ContainerStarted","Data":"cc68bf99684cb5ff76d5de16ab39bad2d5f93a075e515ad4e59da8fe8111446b"} Jan 27 00:31:00 crc kubenswrapper[4774]: I0127 00:31:00.153852 4774 generic.go:334] "Generic (PLEG): container finished" podID="fcff99b3-b2b8-4a05-b19c-1498f32af54f" containerID="2b5d96bc668cc1b8799c7e981e3b41a835d8526d3b481d954264c13d6e5699d3" exitCode=0 Jan 27 00:31:00 crc kubenswrapper[4774]: I0127 00:31:00.153922 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"fcff99b3-b2b8-4a05-b19c-1498f32af54f","Type":"ContainerDied","Data":"2b5d96bc668cc1b8799c7e981e3b41a835d8526d3b481d954264c13d6e5699d3"} Jan 27 00:31:01 crc kubenswrapper[4774]: I0127 00:31:01.167053 4774 generic.go:334] "Generic (PLEG): container finished" podID="fcff99b3-b2b8-4a05-b19c-1498f32af54f" containerID="45e593b1bf5a453c9b422463098e79f573320f2853ae4259742d4030ee049132" exitCode=0 Jan 27 00:31:01 crc kubenswrapper[4774]: I0127 00:31:01.167138 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"fcff99b3-b2b8-4a05-b19c-1498f32af54f","Type":"ContainerDied","Data":"45e593b1bf5a453c9b422463098e79f573320f2853ae4259742d4030ee049132"} Jan 27 00:31:01 crc kubenswrapper[4774]: I0127 00:31:01.217181 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-2-build_fcff99b3-b2b8-4a05-b19c-1498f32af54f/manage-dockerfile/0.log" Jan 27 00:31:02 crc kubenswrapper[4774]: I0127 00:31:02.179218 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"fcff99b3-b2b8-4a05-b19c-1498f32af54f","Type":"ContainerStarted","Data":"32bed463a3ccb3de624f33f4d2d5399fb499d0c44ef103015a08bbc48c8cae78"} Jan 27 00:31:02 crc kubenswrapper[4774]: I0127 00:31:02.216883 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-bundle-2-build" podStartSLOduration=5.216827574 podStartE2EDuration="5.216827574s" podCreationTimestamp="2026-01-27 00:30:57 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:31:02.213398721 +0000 UTC m=+1440.519175645" watchObservedRunningTime="2026-01-27 00:31:02.216827574 +0000 UTC m=+1440.522604488" Jan 27 00:31:05 crc kubenswrapper[4774]: I0127 00:31:05.204674 4774 generic.go:334] "Generic (PLEG): container finished" podID="fcff99b3-b2b8-4a05-b19c-1498f32af54f" containerID="32bed463a3ccb3de624f33f4d2d5399fb499d0c44ef103015a08bbc48c8cae78" exitCode=0 Jan 27 00:31:05 crc kubenswrapper[4774]: I0127 00:31:05.205664 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"fcff99b3-b2b8-4a05-b19c-1498f32af54f","Type":"ContainerDied","Data":"32bed463a3ccb3de624f33f4d2d5399fb499d0c44ef103015a08bbc48c8cae78"} Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.555317 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690252 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildworkdir\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690385 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-system-configs\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690422 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-blob-cache\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690455 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-pull\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690493 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-ca-bundles\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690545 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-proxy-ca-bundles\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690603 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-root\") pod 
\"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690677 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildcachedir\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690738 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-push\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690774 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4mrl\" (UniqueName: \"kubernetes.io/projected/fcff99b3-b2b8-4a05-b19c-1498f32af54f-kube-api-access-r4mrl\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690815 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-run\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690816 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690929 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-node-pullsecrets\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.691337 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.691388 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.691609 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.691646 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.691686 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.692372 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.692397 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.693695 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.696837 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.697584 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.698576 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcff99b3-b2b8-4a05-b19c-1498f32af54f-kube-api-access-r4mrl" (OuterVolumeSpecName: "kube-api-access-r4mrl") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "kube-api-access-r4mrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.701423 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793142 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4mrl\" (UniqueName: \"kubernetes.io/projected/fcff99b3-b2b8-4a05-b19c-1498f32af54f-kube-api-access-r4mrl\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793193 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793212 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793229 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793247 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793262 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793278 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793295 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793311 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793329 
4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793344 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:07 crc kubenswrapper[4774]: I0127 00:31:07.239346 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"fcff99b3-b2b8-4a05-b19c-1498f32af54f","Type":"ContainerDied","Data":"cc68bf99684cb5ff76d5de16ab39bad2d5f93a075e515ad4e59da8fe8111446b"} Jan 27 00:31:07 crc kubenswrapper[4774]: I0127 00:31:07.239424 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc68bf99684cb5ff76d5de16ab39bad2d5f93a075e515ad4e59da8fe8111446b" Jan 27 00:31:07 crc kubenswrapper[4774]: I0127 00:31:07.239573 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.115562 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Jan 27 00:31:24 crc kubenswrapper[4774]: E0127 00:31:24.116889 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcff99b3-b2b8-4a05-b19c-1498f32af54f" containerName="manage-dockerfile" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.116911 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcff99b3-b2b8-4a05-b19c-1498f32af54f" containerName="manage-dockerfile" Jan 27 00:31:24 crc kubenswrapper[4774]: E0127 00:31:24.116930 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcff99b3-b2b8-4a05-b19c-1498f32af54f" containerName="git-clone" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.116944 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcff99b3-b2b8-4a05-b19c-1498f32af54f" containerName="git-clone" Jan 27 00:31:24 crc kubenswrapper[4774]: E0127 00:31:24.116964 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcff99b3-b2b8-4a05-b19c-1498f32af54f" containerName="docker-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.116975 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcff99b3-b2b8-4a05-b19c-1498f32af54f" containerName="docker-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.117145 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcff99b3-b2b8-4a05-b19c-1498f32af54f" containerName="docker-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.119019 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.125438 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-global-ca" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.125674 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-index-dockercfg" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.125847 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-ca" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.126430 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.128281 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-sys-config" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.153193 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271254 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271384 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271478 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271512 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271539 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " 
pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271567 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271610 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271641 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271671 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271797 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271838 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9cf8\" (UniqueName: \"kubernetes.io/projected/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-kube-api-access-f9cf8\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271891 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271916 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" 
(UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.374987 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375077 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375127 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375163 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375210 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375285 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375322 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375389 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: 
\"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375422 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9cf8\" (UniqueName: \"kubernetes.io/projected/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-kube-api-access-f9cf8\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375448 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375479 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375529 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375589 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.376266 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.376529 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.381464 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" 
Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.382126 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.382494 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.382522 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.382574 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.382839 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.383028 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.387785 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.387934 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.388337 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.409166 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9cf8\" (UniqueName: \"kubernetes.io/projected/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-kube-api-access-f9cf8\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.443725 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.786414 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Jan 27 00:31:25 crc kubenswrapper[4774]: I0127 00:31:25.399456 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d","Type":"ContainerStarted","Data":"0aac3241d5cb55fd33c30eb348f2d910363507d410c6e427004e9f23f1c35deb"} Jan 27 00:31:25 crc kubenswrapper[4774]: I0127 00:31:25.399984 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d","Type":"ContainerStarted","Data":"2535846c7c29096f9e9f5c73067f9cff2bd19eb11c98df80dd4d6a08c09a677e"} Jan 27 00:31:26 crc kubenswrapper[4774]: I0127 00:31:26.410440 4774 generic.go:334] "Generic (PLEG): container finished" podID="d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" containerID="0aac3241d5cb55fd33c30eb348f2d910363507d410c6e427004e9f23f1c35deb" exitCode=0 Jan 27 00:31:26 crc kubenswrapper[4774]: I0127 00:31:26.410522 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d","Type":"ContainerDied","Data":"0aac3241d5cb55fd33c30eb348f2d910363507d410c6e427004e9f23f1c35deb"} Jan 27 00:31:27 crc kubenswrapper[4774]: I0127 00:31:27.423705 4774 generic.go:334] "Generic (PLEG): container finished" podID="d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" containerID="288eefec81b0e990905f2d152c7dee1754da7b88b63cb0100e9830b4ad22e8da" exitCode=0 Jan 27 00:31:27 crc kubenswrapper[4774]: I0127 00:31:27.423769 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d","Type":"ContainerDied","Data":"288eefec81b0e990905f2d152c7dee1754da7b88b63cb0100e9830b4ad22e8da"} Jan 27 00:31:27 crc kubenswrapper[4774]: I0127 00:31:27.473599 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-1-build_d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d/manage-dockerfile/0.log" Jan 27 00:31:28 crc kubenswrapper[4774]: I0127 00:31:28.434822 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" 
event={"ID":"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d","Type":"ContainerStarted","Data":"8e0c1422d3f12ac19f4457cba5615e0ac867694b08ef9415ad1cd3f3c7dbeda8"} Jan 27 00:31:28 crc kubenswrapper[4774]: I0127 00:31:28.499109 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-index-1-build" podStartSLOduration=4.499076191 podStartE2EDuration="4.499076191s" podCreationTimestamp="2026-01-27 00:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:31:28.47328875 +0000 UTC m=+1466.779065644" watchObservedRunningTime="2026-01-27 00:31:28.499076191 +0000 UTC m=+1466.804853095" Jan 27 00:31:57 crc kubenswrapper[4774]: I0127 00:31:57.652161 4774 generic.go:334] "Generic (PLEG): container finished" podID="d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" containerID="8e0c1422d3f12ac19f4457cba5615e0ac867694b08ef9415ad1cd3f3c7dbeda8" exitCode=0 Jan 27 00:31:57 crc kubenswrapper[4774]: I0127 00:31:57.652274 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d","Type":"ContainerDied","Data":"8e0c1422d3f12ac19f4457cba5615e0ac867694b08ef9415ad1cd3f3c7dbeda8"} Jan 27 00:31:58 crc kubenswrapper[4774]: I0127 00:31:58.942041 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.058634 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildworkdir\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.058706 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-push\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.058736 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-proxy-ca-bundles\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.058794 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9cf8\" (UniqueName: \"kubernetes.io/projected/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-kube-api-access-f9cf8\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.058845 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.058902 4774 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-pull\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.058932 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-system-configs\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.058991 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-blob-cache\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.059023 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildcachedir\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.059047 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-run\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.059082 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-root\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.059119 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-ca-bundles\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.059176 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-node-pullsecrets\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.059557 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.059985 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.060731 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.060808 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.061005 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.060998 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.061316 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.067906 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.069564 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-kube-api-access-f9cf8" (OuterVolumeSpecName: "kube-api-access-f9cf8") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "kube-api-access-f9cf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.069639 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.071116 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.161313 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.161377 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.161401 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.161419 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.161437 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.161454 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.161472 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 
00:31:59.161490 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9cf8\" (UniqueName: \"kubernetes.io/projected/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-kube-api-access-f9cf8\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.161508 4774 reconciler_common.go:293] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-service-telemetry-framework-index-dockercfg-user-build-volume\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.161528 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.161591 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.265213 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.365171 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.676158 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d","Type":"ContainerDied","Data":"2535846c7c29096f9e9f5c73067f9cff2bd19eb11c98df80dd4d6a08c09a677e"} Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.676206 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2535846c7c29096f9e9f5c73067f9cff2bd19eb11c98df80dd4d6a08c09a677e" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.676299 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.426528 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.485083 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.852935 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-vwmhk"] Jan 27 00:32:00 crc kubenswrapper[4774]: E0127 00:32:00.853678 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" containerName="docker-build" Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.853723 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" containerName="docker-build" Jan 27 00:32:00 crc kubenswrapper[4774]: E0127 00:32:00.853759 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" containerName="manage-dockerfile" Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.853779 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" containerName="manage-dockerfile" Jan 27 00:32:00 crc kubenswrapper[4774]: E0127 00:32:00.853825 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" containerName="git-clone" Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.853843 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" containerName="git-clone" Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.854122 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" containerName="docker-build" Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.855153 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-vwmhk" Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.856421 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-vwmhk"] Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.860399 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"infrawatch-operators-dockercfg-lx54v" Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.992029 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7kgl\" (UniqueName: \"kubernetes.io/projected/bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8-kube-api-access-j7kgl\") pod \"infrawatch-operators-vwmhk\" (UID: \"bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8\") " pod="service-telemetry/infrawatch-operators-vwmhk" Jan 27 00:32:01 crc kubenswrapper[4774]: I0127 00:32:01.094020 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7kgl\" (UniqueName: \"kubernetes.io/projected/bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8-kube-api-access-j7kgl\") pod \"infrawatch-operators-vwmhk\" (UID: \"bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8\") " pod="service-telemetry/infrawatch-operators-vwmhk" Jan 27 00:32:01 crc kubenswrapper[4774]: I0127 00:32:01.111056 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7kgl\" (UniqueName: \"kubernetes.io/projected/bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8-kube-api-access-j7kgl\") pod \"infrawatch-operators-vwmhk\" (UID: \"bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8\") " pod="service-telemetry/infrawatch-operators-vwmhk" Jan 27 00:32:01 crc kubenswrapper[4774]: I0127 00:32:01.174761 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-vwmhk" Jan 27 00:32:01 crc kubenswrapper[4774]: I0127 00:32:01.432443 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-vwmhk"] Jan 27 00:32:01 crc kubenswrapper[4774]: W0127 00:32:01.438037 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcbe95ca_c88b_4959_ba19_a9f1fb1bbfd8.slice/crio-b2b329d8dc00ef22ffe5b0d5450c51e0ecc82d260d99e775a032106f82924319 WatchSource:0}: Error finding container b2b329d8dc00ef22ffe5b0d5450c51e0ecc82d260d99e775a032106f82924319: Status 404 returned error can't find the container with id b2b329d8dc00ef22ffe5b0d5450c51e0ecc82d260d99e775a032106f82924319 Jan 27 00:32:01 crc kubenswrapper[4774]: I0127 00:32:01.699602 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-vwmhk" event={"ID":"bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8","Type":"ContainerStarted","Data":"b2b329d8dc00ef22ffe5b0d5450c51e0ecc82d260d99e775a032106f82924319"} Jan 27 00:32:05 crc kubenswrapper[4774]: I0127 00:32:05.226267 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-vwmhk"] Jan 27 00:32:06 crc kubenswrapper[4774]: I0127 00:32:06.033990 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-cn27j"] Jan 27 00:32:06 crc kubenswrapper[4774]: I0127 00:32:06.035159 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-cn27j" Jan 27 00:32:06 crc kubenswrapper[4774]: I0127 00:32:06.048563 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-cn27j"] Jan 27 00:32:06 crc kubenswrapper[4774]: I0127 00:32:06.070438 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh4mw\" (UniqueName: \"kubernetes.io/projected/59e04641-33ea-4c4e-8ebd-8cf84728cd95-kube-api-access-kh4mw\") pod \"infrawatch-operators-cn27j\" (UID: \"59e04641-33ea-4c4e-8ebd-8cf84728cd95\") " pod="service-telemetry/infrawatch-operators-cn27j" Jan 27 00:32:06 crc kubenswrapper[4774]: I0127 00:32:06.172647 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh4mw\" (UniqueName: \"kubernetes.io/projected/59e04641-33ea-4c4e-8ebd-8cf84728cd95-kube-api-access-kh4mw\") pod \"infrawatch-operators-cn27j\" (UID: \"59e04641-33ea-4c4e-8ebd-8cf84728cd95\") " pod="service-telemetry/infrawatch-operators-cn27j" Jan 27 00:32:06 crc kubenswrapper[4774]: I0127 00:32:06.196125 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh4mw\" (UniqueName: \"kubernetes.io/projected/59e04641-33ea-4c4e-8ebd-8cf84728cd95-kube-api-access-kh4mw\") pod \"infrawatch-operators-cn27j\" (UID: \"59e04641-33ea-4c4e-8ebd-8cf84728cd95\") " pod="service-telemetry/infrawatch-operators-cn27j" Jan 27 00:32:06 crc kubenswrapper[4774]: I0127 00:32:06.355552 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-cn27j" Jan 27 00:32:12 crc kubenswrapper[4774]: I0127 00:32:12.111849 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-cn27j"] Jan 27 00:32:12 crc kubenswrapper[4774]: I0127 00:32:12.788552 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-cn27j" event={"ID":"59e04641-33ea-4c4e-8ebd-8cf84728cd95","Type":"ContainerStarted","Data":"4ed4e3fb8816316d4ac7d9c7d093d5f695a4ef5a205546d613809d8ff377ffdc"} Jan 27 00:32:12 crc kubenswrapper[4774]: I0127 00:32:12.788891 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-cn27j" event={"ID":"59e04641-33ea-4c4e-8ebd-8cf84728cd95","Type":"ContainerStarted","Data":"c6e03c1650e7688d361558deb96a907e1ae5d0c30ce51b720fa51b06dc2237f4"} Jan 27 00:32:12 crc kubenswrapper[4774]: I0127 00:32:12.790651 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-vwmhk" event={"ID":"bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8","Type":"ContainerStarted","Data":"5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299"} Jan 27 00:32:12 crc kubenswrapper[4774]: I0127 00:32:12.790851 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-vwmhk" podUID="bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8" containerName="registry-server" containerID="cri-o://5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299" gracePeriod=2 Jan 27 00:32:12 crc kubenswrapper[4774]: I0127 00:32:12.814501 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-cn27j" podStartSLOduration=6.717040112 podStartE2EDuration="6.814472723s" podCreationTimestamp="2026-01-27 00:32:06 +0000 UTC" firstStartedPulling="2026-01-27 00:32:12.117024335 +0000 UTC 
m=+1510.422801229" lastFinishedPulling="2026-01-27 00:32:12.214456956 +0000 UTC m=+1510.520233840" observedRunningTime="2026-01-27 00:32:12.812827999 +0000 UTC m=+1511.118604893" watchObservedRunningTime="2026-01-27 00:32:12.814472723 +0000 UTC m=+1511.120249627" Jan 27 00:32:12 crc kubenswrapper[4774]: I0127 00:32:12.832003 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-vwmhk" podStartSLOduration=2.41902672 podStartE2EDuration="12.831979373s" podCreationTimestamp="2026-01-27 00:32:00 +0000 UTC" firstStartedPulling="2026-01-27 00:32:01.440203151 +0000 UTC m=+1499.745980035" lastFinishedPulling="2026-01-27 00:32:11.853155784 +0000 UTC m=+1510.158932688" observedRunningTime="2026-01-27 00:32:12.827340068 +0000 UTC m=+1511.133116952" watchObservedRunningTime="2026-01-27 00:32:12.831979373 +0000 UTC m=+1511.137756267" Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.185569 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-vwmhk" Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.327353 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7kgl\" (UniqueName: \"kubernetes.io/projected/bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8-kube-api-access-j7kgl\") pod \"bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8\" (UID: \"bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8\") " Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.332361 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8-kube-api-access-j7kgl" (OuterVolumeSpecName: "kube-api-access-j7kgl") pod "bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8" (UID: "bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8"). InnerVolumeSpecName "kube-api-access-j7kgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.429181 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7kgl\" (UniqueName: \"kubernetes.io/projected/bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8-kube-api-access-j7kgl\") on node \"crc\" DevicePath \"\"" Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.803942 4774 generic.go:334] "Generic (PLEG): container finished" podID="bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8" containerID="5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299" exitCode=0 Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.804001 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-vwmhk" Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.804018 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-vwmhk" event={"ID":"bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8","Type":"ContainerDied","Data":"5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299"} Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.804102 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-vwmhk" event={"ID":"bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8","Type":"ContainerDied","Data":"b2b329d8dc00ef22ffe5b0d5450c51e0ecc82d260d99e775a032106f82924319"} Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.804181 4774 scope.go:117] "RemoveContainer" containerID="5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299" Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.825780 4774 scope.go:117] "RemoveContainer" containerID="5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299" Jan 27 00:32:13 crc kubenswrapper[4774]: E0127 00:32:13.826299 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299\": container with ID starting with 5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299 not found: ID does not exist" containerID="5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299" Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.826344 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299"} err="failed to get container status \"5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299\": rpc error: code = NotFound desc = could not find container \"5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299\": container with ID starting with 5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299 not found: ID does not exist" Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.844175 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-vwmhk"] Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.849194 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-vwmhk"] Jan 27 00:32:14 crc kubenswrapper[4774]: I0127 00:32:14.365521 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8" path="/var/lib/kubelet/pods/bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8/volumes" Jan 27 00:32:16 crc kubenswrapper[4774]: I0127 00:32:16.364537 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-cn27j" Jan 27 00:32:16 crc kubenswrapper[4774]: I0127 00:32:16.365165 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-cn27j" Jan 27 00:32:16 crc kubenswrapper[4774]: I0127 00:32:16.407108 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-cn27j" Jan 27 00:32:26 crc kubenswrapper[4774]: I0127 00:32:26.405930 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-cn27j" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.710213 4774 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7"] Jan 27 00:32:33 crc kubenswrapper[4774]: E0127 00:32:33.711592 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8" containerName="registry-server" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.711615 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8" containerName="registry-server" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.711833 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8" containerName="registry-server" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.713302 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.735575 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7"] Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.833512 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.833636 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7d45\" (UniqueName: \"kubernetes.io/projected/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-kube-api-access-n7d45\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.833691 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.935616 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7d45\" (UniqueName: \"kubernetes.io/projected/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-kube-api-access-n7d45\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.935673 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:33 crc 
kubenswrapper[4774]: I0127 00:32:33.935785 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.936359 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.936590 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.961937 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7d45\" (UniqueName: \"kubernetes.io/projected/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-kube-api-access-n7d45\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.055242 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.328200 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7"] Jan 27 00:32:34 crc kubenswrapper[4774]: W0127 00:32:34.333148 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc4a7d0e_83e6_4280_b61b_9741ebc0e7b8.slice/crio-817a982558f2b31bcd3fad39c11f6abb57a2c240e45a7ee4d606bb41094cef97 WatchSource:0}: Error finding container 817a982558f2b31bcd3fad39c11f6abb57a2c240e45a7ee4d606bb41094cef97: Status 404 returned error can't find the container with id 817a982558f2b31bcd3fad39c11f6abb57a2c240e45a7ee4d606bb41094cef97 Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.583963 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6"] Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.585218 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.586434 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6"] Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.776635 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkw2m\" (UniqueName: \"kubernetes.io/projected/ba2c0178-4760-4b77-990e-35b2bcd11729-kube-api-access-rkw2m\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.777209 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.777260 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.878549 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.879102 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.879357 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.879509 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 
crc kubenswrapper[4774]: I0127 00:32:34.879661 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkw2m\" (UniqueName: \"kubernetes.io/projected/ba2c0178-4760-4b77-990e-35b2bcd11729-kube-api-access-rkw2m\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.899947 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkw2m\" (UniqueName: \"kubernetes.io/projected/ba2c0178-4760-4b77-990e-35b2bcd11729-kube-api-access-rkw2m\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.901546 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.988770 4774 generic.go:334] "Generic (PLEG): container finished" podID="bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" containerID="f87228d204d05e2c7edfba62efba195ea1177f1d742fa63691cff26a62826ec8" exitCode=0 Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.988820 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" event={"ID":"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8","Type":"ContainerDied","Data":"f87228d204d05e2c7edfba62efba195ea1177f1d742fa63691cff26a62826ec8"} Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.988853 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" event={"ID":"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8","Type":"ContainerStarted","Data":"817a982558f2b31bcd3fad39c11f6abb57a2c240e45a7ee4d606bb41094cef97"} Jan 27 00:32:35 crc kubenswrapper[4774]: I0127 00:32:35.203520 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6"] Jan 27 00:32:36 crc kubenswrapper[4774]: I0127 00:32:35.999959 4774 generic.go:334] "Generic (PLEG): container finished" podID="ba2c0178-4760-4b77-990e-35b2bcd11729" containerID="7c2af4f30102be4e7a9548c15d0ad6af09b9f6809929ebe6feb135743669b8fd" exitCode=0 Jan 27 00:32:36 crc kubenswrapper[4774]: I0127 00:32:36.000095 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" event={"ID":"ba2c0178-4760-4b77-990e-35b2bcd11729","Type":"ContainerDied","Data":"7c2af4f30102be4e7a9548c15d0ad6af09b9f6809929ebe6feb135743669b8fd"} Jan 27 00:32:36 crc kubenswrapper[4774]: I0127 00:32:36.000140 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" event={"ID":"ba2c0178-4760-4b77-990e-35b2bcd11729","Type":"ContainerStarted","Data":"c5035b3849c2e389030163c442ab41df56f1602f87c1dae5841a6e2bd5012072"} Jan 27 00:32:36 crc kubenswrapper[4774]: I0127 00:32:36.007636 4774 generic.go:334] "Generic (PLEG): container finished" podID="bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" containerID="f0e75a394cc094a5f4cec0ec7069884dbd9edda6ba29db3b1174a591ce8c3a8f" 
exitCode=0 Jan 27 00:32:36 crc kubenswrapper[4774]: I0127 00:32:36.007718 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" event={"ID":"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8","Type":"ContainerDied","Data":"f0e75a394cc094a5f4cec0ec7069884dbd9edda6ba29db3b1174a591ce8c3a8f"} Jan 27 00:32:36 crc kubenswrapper[4774]: I0127 00:32:36.675900 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:32:36 crc kubenswrapper[4774]: I0127 00:32:36.676007 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:32:37 crc kubenswrapper[4774]: I0127 00:32:37.022373 4774 generic.go:334] "Generic (PLEG): container finished" podID="bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" containerID="cb3a7f05e8d9c8671aff7b6c09b3c0b0699a1a069036af739ee21f3e6a9ba97b" exitCode=0 Jan 27 00:32:37 crc kubenswrapper[4774]: I0127 00:32:37.022452 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" event={"ID":"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8","Type":"ContainerDied","Data":"cb3a7f05e8d9c8671aff7b6c09b3c0b0699a1a069036af739ee21f3e6a9ba97b"} Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.032012 4774 generic.go:334] "Generic (PLEG): container finished" podID="ba2c0178-4760-4b77-990e-35b2bcd11729" containerID="a2fe27398823d30032f627a91fc5c0a9fedb46b77683c07313850ad63a7b211c" exitCode=0 Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.032388 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" event={"ID":"ba2c0178-4760-4b77-990e-35b2bcd11729","Type":"ContainerDied","Data":"a2fe27398823d30032f627a91fc5c0a9fedb46b77683c07313850ad63a7b211c"} Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.331075 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.336010 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-util\") pod \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.336052 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-bundle\") pod \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.336088 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7d45\" (UniqueName: \"kubernetes.io/projected/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-kube-api-access-n7d45\") pod \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.336815 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-bundle" (OuterVolumeSpecName: "bundle") pod "bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" (UID: "bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.344116 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-kube-api-access-n7d45" (OuterVolumeSpecName: "kube-api-access-n7d45") pod "bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" (UID: "bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8"). InnerVolumeSpecName "kube-api-access-n7d45". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.369835 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-util" (OuterVolumeSpecName: "util") pod "bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" (UID: "bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.441609 4774 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-util\") on node \"crc\" DevicePath \"\"" Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.441751 4774 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.441783 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7d45\" (UniqueName: \"kubernetes.io/projected/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-kube-api-access-n7d45\") on node \"crc\" DevicePath \"\"" Jan 27 00:32:39 crc kubenswrapper[4774]: I0127 00:32:39.046095 4774 generic.go:334] "Generic (PLEG): container finished" podID="ba2c0178-4760-4b77-990e-35b2bcd11729" containerID="3f2e4169dbca4055ddabddcbc7c0c8c18ecded89bc78879bc04e58cb8423ee4a" exitCode=0 Jan 27 00:32:39 crc kubenswrapper[4774]: I0127 00:32:39.046265 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" event={"ID":"ba2c0178-4760-4b77-990e-35b2bcd11729","Type":"ContainerDied","Data":"3f2e4169dbca4055ddabddcbc7c0c8c18ecded89bc78879bc04e58cb8423ee4a"} Jan 27 00:32:39 crc kubenswrapper[4774]: I0127 00:32:39.049396 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" event={"ID":"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8","Type":"ContainerDied","Data":"817a982558f2b31bcd3fad39c11f6abb57a2c240e45a7ee4d606bb41094cef97"} Jan 27 00:32:39 crc kubenswrapper[4774]: I0127 00:32:39.049826 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="817a982558f2b31bcd3fad39c11f6abb57a2c240e45a7ee4d606bb41094cef97" Jan 27 00:32:39 crc kubenswrapper[4774]: I0127 00:32:39.049773 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:40 crc kubenswrapper[4774]: I0127 00:32:40.434178 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:40 crc kubenswrapper[4774]: I0127 00:32:40.575736 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-util\") pod \"ba2c0178-4760-4b77-990e-35b2bcd11729\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " Jan 27 00:32:40 crc kubenswrapper[4774]: I0127 00:32:40.576010 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkw2m\" (UniqueName: \"kubernetes.io/projected/ba2c0178-4760-4b77-990e-35b2bcd11729-kube-api-access-rkw2m\") pod \"ba2c0178-4760-4b77-990e-35b2bcd11729\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " Jan 27 00:32:40 crc kubenswrapper[4774]: I0127 00:32:40.576227 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-bundle\") pod \"ba2c0178-4760-4b77-990e-35b2bcd11729\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " Jan 27 00:32:40 crc kubenswrapper[4774]: I0127 00:32:40.577291 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-bundle" (OuterVolumeSpecName: "bundle") pod "ba2c0178-4760-4b77-990e-35b2bcd11729" (UID: "ba2c0178-4760-4b77-990e-35b2bcd11729"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:32:40 crc kubenswrapper[4774]: I0127 00:32:40.584838 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba2c0178-4760-4b77-990e-35b2bcd11729-kube-api-access-rkw2m" (OuterVolumeSpecName: "kube-api-access-rkw2m") pod "ba2c0178-4760-4b77-990e-35b2bcd11729" (UID: "ba2c0178-4760-4b77-990e-35b2bcd11729"). InnerVolumeSpecName "kube-api-access-rkw2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:32:40 crc kubenswrapper[4774]: I0127 00:32:40.614606 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-util" (OuterVolumeSpecName: "util") pod "ba2c0178-4760-4b77-990e-35b2bcd11729" (UID: "ba2c0178-4760-4b77-990e-35b2bcd11729"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:32:40 crc kubenswrapper[4774]: I0127 00:32:40.677968 4774 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:32:40 crc kubenswrapper[4774]: I0127 00:32:40.678025 4774 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-util\") on node \"crc\" DevicePath \"\"" Jan 27 00:32:40 crc kubenswrapper[4774]: I0127 00:32:40.678046 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkw2m\" (UniqueName: \"kubernetes.io/projected/ba2c0178-4760-4b77-990e-35b2bcd11729-kube-api-access-rkw2m\") on node \"crc\" DevicePath \"\"" Jan 27 00:32:41 crc kubenswrapper[4774]: I0127 00:32:41.076712 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" event={"ID":"ba2c0178-4760-4b77-990e-35b2bcd11729","Type":"ContainerDied","Data":"c5035b3849c2e389030163c442ab41df56f1602f87c1dae5841a6e2bd5012072"} Jan 27 00:32:41 crc kubenswrapper[4774]: I0127 00:32:41.076808 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:41 crc kubenswrapper[4774]: I0127 00:32:41.076814 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5035b3849c2e389030163c442ab41df56f1602f87c1dae5841a6e2bd5012072" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.714797 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f"] Jan 27 00:32:44 crc kubenswrapper[4774]: E0127 00:32:44.715815 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" containerName="pull" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.715829 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" containerName="pull" Jan 27 00:32:44 crc kubenswrapper[4774]: E0127 00:32:44.715837 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" containerName="util" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.715843 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" containerName="util" Jan 27 00:32:44 crc kubenswrapper[4774]: E0127 00:32:44.715883 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2c0178-4760-4b77-990e-35b2bcd11729" containerName="extract" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.715890 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2c0178-4760-4b77-990e-35b2bcd11729" containerName="extract" Jan 27 00:32:44 crc kubenswrapper[4774]: E0127 00:32:44.715899 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2c0178-4760-4b77-990e-35b2bcd11729" containerName="pull" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.715905 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2c0178-4760-4b77-990e-35b2bcd11729" containerName="pull" Jan 27 00:32:44 crc kubenswrapper[4774]: E0127 00:32:44.715919 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2c0178-4760-4b77-990e-35b2bcd11729" containerName="util" Jan 27 00:32:44 crc 
kubenswrapper[4774]: I0127 00:32:44.715925 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2c0178-4760-4b77-990e-35b2bcd11729" containerName="util" Jan 27 00:32:44 crc kubenswrapper[4774]: E0127 00:32:44.715942 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" containerName="extract" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.715947 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" containerName="extract" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.716044 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" containerName="extract" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.716057 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba2c0178-4760-4b77-990e-35b2bcd11729" containerName="extract" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.716491 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.718736 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-7q8nh" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.737291 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f"] Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.845001 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/61fa30c6-90bc-4b6c-b850-2b2e59506e08-runner\") pod \"smart-gateway-operator-7b4c7b595f-78n7f\" (UID: \"61fa30c6-90bc-4b6c-b850-2b2e59506e08\") " pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.845104 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj27v\" (UniqueName: \"kubernetes.io/projected/61fa30c6-90bc-4b6c-b850-2b2e59506e08-kube-api-access-fj27v\") pod \"smart-gateway-operator-7b4c7b595f-78n7f\" (UID: \"61fa30c6-90bc-4b6c-b850-2b2e59506e08\") " pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.946568 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/61fa30c6-90bc-4b6c-b850-2b2e59506e08-runner\") pod \"smart-gateway-operator-7b4c7b595f-78n7f\" (UID: \"61fa30c6-90bc-4b6c-b850-2b2e59506e08\") " pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.946641 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj27v\" (UniqueName: \"kubernetes.io/projected/61fa30c6-90bc-4b6c-b850-2b2e59506e08-kube-api-access-fj27v\") pod \"smart-gateway-operator-7b4c7b595f-78n7f\" (UID: \"61fa30c6-90bc-4b6c-b850-2b2e59506e08\") " pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.947136 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/61fa30c6-90bc-4b6c-b850-2b2e59506e08-runner\") pod \"smart-gateway-operator-7b4c7b595f-78n7f\" (UID: \"61fa30c6-90bc-4b6c-b850-2b2e59506e08\") " 
pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.970555 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj27v\" (UniqueName: \"kubernetes.io/projected/61fa30c6-90bc-4b6c-b850-2b2e59506e08-kube-api-access-fj27v\") pod \"smart-gateway-operator-7b4c7b595f-78n7f\" (UID: \"61fa30c6-90bc-4b6c-b850-2b2e59506e08\") " pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" Jan 27 00:32:45 crc kubenswrapper[4774]: I0127 00:32:45.036738 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" Jan 27 00:32:45 crc kubenswrapper[4774]: I0127 00:32:45.339693 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f"] Jan 27 00:32:46 crc kubenswrapper[4774]: I0127 00:32:46.125443 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" event={"ID":"61fa30c6-90bc-4b6c-b850-2b2e59506e08","Type":"ContainerStarted","Data":"52e64aa0f0917d58a00bf3f233cb9c02240bdaf763b5e485dbb82ec246634fc2"} Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.577208 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-59f5866557-f7gmp"] Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.578902 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.584746 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-fp78c" Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.604403 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-59f5866557-f7gmp"] Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.708025 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/14f2c5e3-3625-4889-97e4-38820ac84518-runner\") pod \"service-telemetry-operator-59f5866557-f7gmp\" (UID: \"14f2c5e3-3625-4889-97e4-38820ac84518\") " pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.708164 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbdvx\" (UniqueName: \"kubernetes.io/projected/14f2c5e3-3625-4889-97e4-38820ac84518-kube-api-access-pbdvx\") pod \"service-telemetry-operator-59f5866557-f7gmp\" (UID: \"14f2c5e3-3625-4889-97e4-38820ac84518\") " pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.809191 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/14f2c5e3-3625-4889-97e4-38820ac84518-runner\") pod \"service-telemetry-operator-59f5866557-f7gmp\" (UID: \"14f2c5e3-3625-4889-97e4-38820ac84518\") " pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.809284 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbdvx\" (UniqueName: \"kubernetes.io/projected/14f2c5e3-3625-4889-97e4-38820ac84518-kube-api-access-pbdvx\") 
pod \"service-telemetry-operator-59f5866557-f7gmp\" (UID: \"14f2c5e3-3625-4889-97e4-38820ac84518\") " pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.809835 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/14f2c5e3-3625-4889-97e4-38820ac84518-runner\") pod \"service-telemetry-operator-59f5866557-f7gmp\" (UID: \"14f2c5e3-3625-4889-97e4-38820ac84518\") " pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.838553 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbdvx\" (UniqueName: \"kubernetes.io/projected/14f2c5e3-3625-4889-97e4-38820ac84518-kube-api-access-pbdvx\") pod \"service-telemetry-operator-59f5866557-f7gmp\" (UID: \"14f2c5e3-3625-4889-97e4-38820ac84518\") " pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.909629 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" Jan 27 00:32:48 crc kubenswrapper[4774]: I0127 00:32:48.176602 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-59f5866557-f7gmp"] Jan 27 00:32:49 crc kubenswrapper[4774]: I0127 00:32:49.151348 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" event={"ID":"14f2c5e3-3625-4889-97e4-38820ac84518","Type":"ContainerStarted","Data":"680b7b1e6ae9ae46a899b28a8678425ab02fa8641a17baf4f97ccb6c15d7030d"} Jan 27 00:33:01 crc kubenswrapper[4774]: E0127 00:33:01.378562 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/smart-gateway-operator:stable-1.5" Jan 27 00:33:01 crc kubenswrapper[4774]: E0127 00:33:01.379545 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/infrawatch/smart-gateway-operator:stable-1.5,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:smart-gateway-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:ANSIBLE_VERBOSITY_SMARTGATEWAY_SMARTGATEWAY_INFRA_WATCH,Value:4,ValueFrom:nil,},EnvVar{Name:ANSIBLE_DEBUG_LOGS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CORE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BRIDGE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-bridge:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:smart-gateway-operator.v5.0.1769473811,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fj27v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod smart-gateway-operator-7b4c7b595f-78n7f_service-telemetry(61fa30c6-90bc-4b6c-b850-2b2e59506e08): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 00:33:01 crc kubenswrapper[4774]: E0127 00:33:01.380768 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" podUID="61fa30c6-90bc-4b6c-b850-2b2e59506e08" Jan 27 00:33:02 crc kubenswrapper[4774]: E0127 00:33:02.267267 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/smart-gateway-operator:stable-1.5\\\"\"" pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" podUID="61fa30c6-90bc-4b6c-b850-2b2e59506e08" Jan 27 00:33:06 crc kubenswrapper[4774]: I0127 00:33:06.675296 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:33:06 crc kubenswrapper[4774]: I0127 00:33:06.676101 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:33:07 crc kubenswrapper[4774]: I0127 00:33:07.304805 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" event={"ID":"14f2c5e3-3625-4889-97e4-38820ac84518","Type":"ContainerStarted","Data":"d7afeb04dd0ffd0243691462d6091a886d24389b3989eb022072fb60c4475839"} Jan 27 00:33:07 crc kubenswrapper[4774]: I0127 00:33:07.343177 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" podStartSLOduration=2.000696099 podStartE2EDuration="20.343147009s" podCreationTimestamp="2026-01-27 00:32:47 +0000 UTC" firstStartedPulling="2026-01-27 00:32:48.199343626 +0000 UTC m=+1546.505120510" lastFinishedPulling="2026-01-27 00:33:06.541794536 +0000 UTC m=+1564.847571420" observedRunningTime="2026-01-27 00:33:07.339124091 +0000 UTC m=+1565.644901035" watchObservedRunningTime="2026-01-27 00:33:07.343147009 +0000 UTC m=+1565.648923923" Jan 27 00:33:17 crc kubenswrapper[4774]: I0127 00:33:17.408419 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" event={"ID":"61fa30c6-90bc-4b6c-b850-2b2e59506e08","Type":"ContainerStarted","Data":"edb20735273418fbc3f51d9924e60b74128a6d585bbbb04ccd7623fc08a5d45d"} Jan 27 00:33:17 crc kubenswrapper[4774]: I0127 00:33:17.433076 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" podStartSLOduration=1.940795568 podStartE2EDuration="33.433042655s" podCreationTimestamp="2026-01-27 00:32:44 +0000 UTC" firstStartedPulling="2026-01-27 00:32:45.349215055 +0000 UTC m=+1543.654991939" lastFinishedPulling="2026-01-27 00:33:16.841462142 +0000 UTC m=+1575.147239026" observedRunningTime="2026-01-27 00:33:17.430420204 +0000 UTC m=+1575.736197098" watchObservedRunningTime="2026-01-27 00:33:17.433042655 +0000 UTC m=+1575.738819549" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.066763 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-b5bgt"] Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.077006 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.083944 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.084400 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.084613 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.084774 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.085036 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.085609 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-cdfh6" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.085764 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.090190 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-b5bgt"] Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.202617 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.203075 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.203118 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-users\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.203474 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdh7w\" (UniqueName: \"kubernetes.io/projected/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-kube-api-access-qdh7w\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.203655 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-config\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.203731 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.203793 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.305776 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.305869 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.305918 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.305961 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.306000 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-users\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc 
kubenswrapper[4774]: I0127 00:33:34.306065 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdh7w\" (UniqueName: \"kubernetes.io/projected/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-kube-api-access-qdh7w\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.306118 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-config\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.307463 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-config\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.321932 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.325751 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-users\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.326650 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.333411 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.333796 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.350766 4774 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qdh7w\" (UniqueName: \"kubernetes.io/projected/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-kube-api-access-qdh7w\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.411172 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.704683 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-b5bgt"] Jan 27 00:33:35 crc kubenswrapper[4774]: I0127 00:33:35.597584 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" event={"ID":"2c8f3edd-38d2-4a50-add2-873dd1ac35e5","Type":"ContainerStarted","Data":"a5ba257ff6bd2d15abb20ec26902d31e3cb2085f1429dfba58273e45e14cd88c"} Jan 27 00:33:36 crc kubenswrapper[4774]: I0127 00:33:36.675378 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:33:36 crc kubenswrapper[4774]: I0127 00:33:36.675954 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:33:36 crc kubenswrapper[4774]: I0127 00:33:36.676027 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:33:36 crc kubenswrapper[4774]: I0127 00:33:36.677271 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541"} pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 00:33:36 crc kubenswrapper[4774]: I0127 00:33:36.677382 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" containerID="cri-o://d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" gracePeriod=600 Jan 27 00:33:36 crc kubenswrapper[4774]: E0127 00:33:36.820033 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:33:37 crc kubenswrapper[4774]: I0127 00:33:37.616574 4774 generic.go:334] "Generic (PLEG): container finished" podID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" exitCode=0 Jan 
27 00:33:37 crc kubenswrapper[4774]: I0127 00:33:37.616628 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerDied","Data":"d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541"} Jan 27 00:33:37 crc kubenswrapper[4774]: I0127 00:33:37.616681 4774 scope.go:117] "RemoveContainer" containerID="1c53cb6c38911d466299f8dab8954ff32a3cd2ab17025a91e5e6eb03240440a5" Jan 27 00:33:37 crc kubenswrapper[4774]: I0127 00:33:37.617679 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:33:37 crc kubenswrapper[4774]: E0127 00:33:37.619349 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:33:40 crc kubenswrapper[4774]: I0127 00:33:40.644927 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" event={"ID":"2c8f3edd-38d2-4a50-add2-873dd1ac35e5","Type":"ContainerStarted","Data":"ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f"} Jan 27 00:33:40 crc kubenswrapper[4774]: I0127 00:33:40.671286 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" podStartSLOduration=1.019683132 podStartE2EDuration="6.67126534s" podCreationTimestamp="2026-01-27 00:33:34 +0000 UTC" firstStartedPulling="2026-01-27 00:33:34.716476797 +0000 UTC m=+1593.022253681" lastFinishedPulling="2026-01-27 00:33:40.368059005 +0000 UTC m=+1598.673835889" observedRunningTime="2026-01-27 00:33:40.670315564 +0000 UTC m=+1598.976092488" watchObservedRunningTime="2026-01-27 00:33:40.67126534 +0000 UTC m=+1598.977042224" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.829200 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.832074 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.835186 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.835670 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.835798 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"serving-certs-ca-bundle" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.836059 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.836403 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.836440 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.836755 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-2" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.836837 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.839130 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-1" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.840818 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-lqwbh" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.852475 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.018474 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.018540 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.018565 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.018593 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/71259922-2e93-4571-80c6-e054f4372056-tls-assets\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.018613 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnhkc\" (UniqueName: \"kubernetes.io/projected/71259922-2e93-4571-80c6-e054f4372056-kube-api-access-hnhkc\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.019014 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/71259922-2e93-4571-80c6-e054f4372056-config-out\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.019445 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.019541 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-web-config\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.019595 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-config\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.019675 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.020142 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-eebaa146-f219-4ffc-bee5-24f5200b6035\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eebaa146-f219-4ffc-bee5-24f5200b6035\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.020237 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc 
kubenswrapper[4774]: I0127 00:33:46.122535 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.122620 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-web-config\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.122669 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-config\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.122716 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.122797 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-eebaa146-f219-4ffc-bee5-24f5200b6035\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eebaa146-f219-4ffc-bee5-24f5200b6035\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: E0127 00:33:46.122827 4774 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Jan 27 00:33:46 crc kubenswrapper[4774]: E0127 00:33:46.122997 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-prometheus-proxy-tls podName:71259922-2e93-4571-80c6-e054f4372056 nodeName:}" failed. No retries permitted until 2026-01-27 00:33:46.622956242 +0000 UTC m=+1604.928733306 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "71259922-2e93-4571-80c6-e054f4372056") : secret "default-prometheus-proxy-tls" not found Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.122855 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.123559 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.123642 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.123718 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.123821 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/71259922-2e93-4571-80c6-e054f4372056-tls-assets\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.123887 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnhkc\" (UniqueName: \"kubernetes.io/projected/71259922-2e93-4571-80c6-e054f4372056-kube-api-access-hnhkc\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.124030 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/71259922-2e93-4571-80c6-e054f4372056-config-out\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.124904 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: 
I0127 00:33:46.125070 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.125473 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.126885 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.134578 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.136448 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-config\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.136800 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/71259922-2e93-4571-80c6-e054f4372056-tls-assets\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.143069 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-web-config\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.143243 4774 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.143302 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-eebaa146-f219-4ffc-bee5-24f5200b6035\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eebaa146-f219-4ffc-bee5-24f5200b6035\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/295e7c323f08c6637b248777cc2f8a161194fb17a8c00659f66f94d7e9196f62/globalmount\"" pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.156628 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/71259922-2e93-4571-80c6-e054f4372056-config-out\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.160446 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnhkc\" (UniqueName: \"kubernetes.io/projected/71259922-2e93-4571-80c6-e054f4372056-kube-api-access-hnhkc\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.186430 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-eebaa146-f219-4ffc-bee5-24f5200b6035\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eebaa146-f219-4ffc-bee5-24f5200b6035\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.632225 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: E0127 00:33:46.632547 4774 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Jan 27 00:33:46 crc kubenswrapper[4774]: E0127 00:33:46.632694 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-prometheus-proxy-tls podName:71259922-2e93-4571-80c6-e054f4372056 nodeName:}" failed. No retries permitted until 2026-01-27 00:33:47.632648509 +0000 UTC m=+1605.938425433 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "71259922-2e93-4571-80c6-e054f4372056") : secret "default-prometheus-proxy-tls" not found Jan 27 00:33:47 crc kubenswrapper[4774]: I0127 00:33:47.647946 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:47 crc kubenswrapper[4774]: I0127 00:33:47.664633 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:47 crc kubenswrapper[4774]: I0127 00:33:47.953971 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0" Jan 27 00:33:48 crc kubenswrapper[4774]: I0127 00:33:48.285093 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Jan 27 00:33:48 crc kubenswrapper[4774]: I0127 00:33:48.714268 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"71259922-2e93-4571-80c6-e054f4372056","Type":"ContainerStarted","Data":"d6b041c8b3b87f133eeccbd1e07edb577775842dd494617252f6f3a9d37ad2a2"} Jan 27 00:33:50 crc kubenswrapper[4774]: I0127 00:33:50.357114 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:33:50 crc kubenswrapper[4774]: E0127 00:33:50.357416 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:33:52 crc kubenswrapper[4774]: I0127 00:33:52.748804 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"71259922-2e93-4571-80c6-e054f4372056","Type":"ContainerStarted","Data":"a1008d19697279cbd14e1da441adaa8c511db421d870ec17dd82ad24769c432c"} Jan 27 00:33:56 crc kubenswrapper[4774]: I0127 00:33:56.005674 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-mzmt2"] Jan 27 00:33:56 crc kubenswrapper[4774]: I0127 00:33:56.006912 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-mzmt2" Jan 27 00:33:56 crc kubenswrapper[4774]: I0127 00:33:56.021140 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-mzmt2"] Jan 27 00:33:56 crc kubenswrapper[4774]: I0127 00:33:56.119691 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h78v\" (UniqueName: \"kubernetes.io/projected/60dce4ee-cdb6-4418-ac73-063f48c8be7e-kube-api-access-2h78v\") pod \"default-snmp-webhook-6856cfb745-mzmt2\" (UID: \"60dce4ee-cdb6-4418-ac73-063f48c8be7e\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-mzmt2" Jan 27 00:33:56 crc kubenswrapper[4774]: I0127 00:33:56.221684 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h78v\" (UniqueName: \"kubernetes.io/projected/60dce4ee-cdb6-4418-ac73-063f48c8be7e-kube-api-access-2h78v\") pod \"default-snmp-webhook-6856cfb745-mzmt2\" (UID: \"60dce4ee-cdb6-4418-ac73-063f48c8be7e\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-mzmt2" Jan 27 00:33:56 crc kubenswrapper[4774]: I0127 00:33:56.242669 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h78v\" (UniqueName: \"kubernetes.io/projected/60dce4ee-cdb6-4418-ac73-063f48c8be7e-kube-api-access-2h78v\") pod \"default-snmp-webhook-6856cfb745-mzmt2\" (UID: \"60dce4ee-cdb6-4418-ac73-063f48c8be7e\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-mzmt2" Jan 27 00:33:56 crc kubenswrapper[4774]: I0127 00:33:56.325888 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-mzmt2" Jan 27 00:33:56 crc kubenswrapper[4774]: I0127 00:33:56.754311 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-mzmt2"] Jan 27 00:33:56 crc kubenswrapper[4774]: I0127 00:33:56.783814 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-mzmt2" event={"ID":"60dce4ee-cdb6-4418-ac73-063f48c8be7e","Type":"ContainerStarted","Data":"cc4113bd613c5a9940e3f7218e5d5ed472db1ff986ad6ac7d7cdf735ebd0b61e"} Jan 27 00:33:59 crc kubenswrapper[4774]: I0127 00:33:59.809559 4774 generic.go:334] "Generic (PLEG): container finished" podID="71259922-2e93-4571-80c6-e054f4372056" containerID="a1008d19697279cbd14e1da441adaa8c511db421d870ec17dd82ad24769c432c" exitCode=0 Jan 27 00:33:59 crc kubenswrapper[4774]: I0127 00:33:59.809681 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"71259922-2e93-4571-80c6-e054f4372056","Type":"ContainerDied","Data":"a1008d19697279cbd14e1da441adaa8c511db421d870ec17dd82ad24769c432c"} Jan 27 00:33:59 crc kubenswrapper[4774]: I0127 00:33:59.992603 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"] Jan 27 00:33:59 crc kubenswrapper[4774]: I0127 00:33:59.994700 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Jan 27 00:33:59 crc kubenswrapper[4774]: I0127 00:33:59.997182 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.001177 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-tls-assets-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.001489 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-c9qmn" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.001616 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.001841 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.001895 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-cluster-tls-config" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.016005 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.190475 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-703cac3f-95be-42e6-9c53-fa72eaf5caa2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-703cac3f-95be-42e6-9c53-fa72eaf5caa2\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.190801 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.191012 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.191140 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/695367d2-8c58-4bee-adcc-61c6d1b7457b-tls-assets\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.191284 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-config-volume\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.191387 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.191999 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/695367d2-8c58-4bee-adcc-61c6d1b7457b-config-out\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.192144 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-web-config\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.192270 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkbq4\" (UniqueName: \"kubernetes.io/projected/695367d2-8c58-4bee-adcc-61c6d1b7457b-kube-api-access-dkbq4\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.293878 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-703cac3f-95be-42e6-9c53-fa72eaf5caa2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-703cac3f-95be-42e6-9c53-fa72eaf5caa2\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.294154 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.294250 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.294326 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/695367d2-8c58-4bee-adcc-61c6d1b7457b-tls-assets\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: E0127 00:34:00.294353 4774 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.294411 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-config-volume\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: E0127 00:34:00.294512 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls podName:695367d2-8c58-4bee-adcc-61c6d1b7457b nodeName:}" failed. No retries permitted until 2026-01-27 00:34:00.794484428 +0000 UTC m=+1619.100261312 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "695367d2-8c58-4bee-adcc-61c6d1b7457b") : secret "default-alertmanager-proxy-tls" not found Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.294545 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.294575 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/695367d2-8c58-4bee-adcc-61c6d1b7457b-config-out\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.294618 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-web-config\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.294659 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkbq4\" (UniqueName: \"kubernetes.io/projected/695367d2-8c58-4bee-adcc-61c6d1b7457b-kube-api-access-dkbq4\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.299590 4774 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.299625 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-703cac3f-95be-42e6-9c53-fa72eaf5caa2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-703cac3f-95be-42e6-9c53-fa72eaf5caa2\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1b199c4d1d90e45fd220b5f013926ce2c98f8abb2ae028e59739279000671a16/globalmount\"" pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.299989 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/695367d2-8c58-4bee-adcc-61c6d1b7457b-tls-assets\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.300701 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.303223 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.304036 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-web-config\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.304140 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/695367d2-8c58-4bee-adcc-61c6d1b7457b-config-out\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.315798 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkbq4\" (UniqueName: \"kubernetes.io/projected/695367d2-8c58-4bee-adcc-61c6d1b7457b-kube-api-access-dkbq4\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.319451 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-config-volume\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.335467 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-703cac3f-95be-42e6-9c53-fa72eaf5caa2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-703cac3f-95be-42e6-9c53-fa72eaf5caa2\") pod 
\"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.801306 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: E0127 00:34:00.801472 4774 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Jan 27 00:34:00 crc kubenswrapper[4774]: E0127 00:34:00.801525 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls podName:695367d2-8c58-4bee-adcc-61c6d1b7457b nodeName:}" failed. No retries permitted until 2026-01-27 00:34:01.801509227 +0000 UTC m=+1620.107286112 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "695367d2-8c58-4bee-adcc-61c6d1b7457b") : secret "default-alertmanager-proxy-tls" not found Jan 27 00:34:01 crc kubenswrapper[4774]: I0127 00:34:01.815666 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:01 crc kubenswrapper[4774]: E0127 00:34:01.815882 4774 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Jan 27 00:34:01 crc kubenswrapper[4774]: E0127 00:34:01.816159 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls podName:695367d2-8c58-4bee-adcc-61c6d1b7457b nodeName:}" failed. No retries permitted until 2026-01-27 00:34:03.816138722 +0000 UTC m=+1622.121915606 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "695367d2-8c58-4bee-adcc-61c6d1b7457b") : secret "default-alertmanager-proxy-tls" not found Jan 27 00:34:03 crc kubenswrapper[4774]: I0127 00:34:03.847761 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:03 crc kubenswrapper[4774]: I0127 00:34:03.855649 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:03 crc kubenswrapper[4774]: I0127 00:34:03.923919 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-c9qmn" Jan 27 00:34:03 crc kubenswrapper[4774]: I0127 00:34:03.932460 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:04 crc kubenswrapper[4774]: I0127 00:34:04.356432 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:34:04 crc kubenswrapper[4774]: E0127 00:34:04.360678 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:34:04 crc kubenswrapper[4774]: I0127 00:34:04.488459 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Jan 27 00:34:04 crc kubenswrapper[4774]: I0127 00:34:04.848559 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"695367d2-8c58-4bee-adcc-61c6d1b7457b","Type":"ContainerStarted","Data":"5f3f65fbf4bd90f241d356ae013fbe904b385f8470eba91cf73287baff00adde"} Jan 27 00:34:04 crc kubenswrapper[4774]: I0127 00:34:04.853652 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-mzmt2" event={"ID":"60dce4ee-cdb6-4418-ac73-063f48c8be7e","Type":"ContainerStarted","Data":"f67e26410ab3b6fdd6b6d63799b1254d45dd36e56de03df11d5c10f3ec9135fb"} Jan 27 00:34:04 crc kubenswrapper[4774]: I0127 00:34:04.879846 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-6856cfb745-mzmt2" podStartSLOduration=2.417712053 podStartE2EDuration="9.879818456s" podCreationTimestamp="2026-01-27 00:33:55 +0000 UTC" firstStartedPulling="2026-01-27 00:33:56.765200369 +0000 UTC m=+1615.070977253" lastFinishedPulling="2026-01-27 00:34:04.227306772 +0000 UTC m=+1622.533083656" observedRunningTime="2026-01-27 
00:34:04.874044491 +0000 UTC m=+1623.179821375" watchObservedRunningTime="2026-01-27 00:34:04.879818456 +0000 UTC m=+1623.185595350" Jan 27 00:34:06 crc kubenswrapper[4774]: I0127 00:34:06.870188 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"695367d2-8c58-4bee-adcc-61c6d1b7457b","Type":"ContainerStarted","Data":"810749b3c3ce02665b129ee8d84b5fefda2c703314683991363a3f30a39bc9ca"} Jan 27 00:34:08 crc kubenswrapper[4774]: I0127 00:34:08.891226 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"71259922-2e93-4571-80c6-e054f4372056","Type":"ContainerStarted","Data":"061e1a8c11e7384319c73f1ff4dd7449660819bd8e6d3e217e71ab859164fe4b"} Jan 27 00:34:11 crc kubenswrapper[4774]: I0127 00:34:11.921399 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"71259922-2e93-4571-80c6-e054f4372056","Type":"ContainerStarted","Data":"9206af83fa1defd5137276c13b171505de9e9b0669078414806578e2b80ea754"} Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.371118 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr"] Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.372933 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.411044 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-fv29r" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.411201 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.418303 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.418798 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr"] Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.419589 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.512033 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/221c6d11-7462-4578-bbb1-a78ee6bad7c0-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.512141 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.512176 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.512360 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrwsx\" (UniqueName: \"kubernetes.io/projected/221c6d11-7462-4578-bbb1-a78ee6bad7c0-kube-api-access-mrwsx\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.512393 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/221c6d11-7462-4578-bbb1-a78ee6bad7c0-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.614269 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/221c6d11-7462-4578-bbb1-a78ee6bad7c0-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.614388 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.614450 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.614539 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrwsx\" (UniqueName: \"kubernetes.io/projected/221c6d11-7462-4578-bbb1-a78ee6bad7c0-kube-api-access-mrwsx\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.614585 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/221c6d11-7462-4578-bbb1-a78ee6bad7c0-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " 
pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: E0127 00:34:13.615146 4774 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.615160 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/221c6d11-7462-4578-bbb1-a78ee6bad7c0-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: E0127 00:34:13.615260 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-default-cloud1-coll-meter-proxy-tls podName:221c6d11-7462-4578-bbb1-a78ee6bad7c0 nodeName:}" failed. No retries permitted until 2026-01-27 00:34:14.11523156 +0000 UTC m=+1632.421008444 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" (UID: "221c6d11-7462-4578-bbb1-a78ee6bad7c0") : secret "default-cloud1-coll-meter-proxy-tls" not found Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.616366 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/221c6d11-7462-4578-bbb1-a78ee6bad7c0-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.621559 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.632883 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrwsx\" (UniqueName: \"kubernetes.io/projected/221c6d11-7462-4578-bbb1-a78ee6bad7c0-kube-api-access-mrwsx\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:14 crc kubenswrapper[4774]: I0127 00:34:14.122823 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:14 crc kubenswrapper[4774]: E0127 00:34:14.123181 4774 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Jan 27 00:34:14 crc 
kubenswrapper[4774]: E0127 00:34:14.123398 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-default-cloud1-coll-meter-proxy-tls podName:221c6d11-7462-4578-bbb1-a78ee6bad7c0 nodeName:}" failed. No retries permitted until 2026-01-27 00:34:15.123350638 +0000 UTC m=+1633.429127572 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" (UID: "221c6d11-7462-4578-bbb1-a78ee6bad7c0") : secret "default-cloud1-coll-meter-proxy-tls" not found Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.159593 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.166071 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.226771 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.477381 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr"] Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.850415 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9"] Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.852381 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.854558 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.868247 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9"] Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.854955 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.977190 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.977246 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2rpf\" (UniqueName: \"kubernetes.io/projected/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-kube-api-access-l2rpf\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.977282 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.977315 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.977337 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.979748 4774 generic.go:334] "Generic (PLEG): container finished" podID="695367d2-8c58-4bee-adcc-61c6d1b7457b" containerID="810749b3c3ce02665b129ee8d84b5fefda2c703314683991363a3f30a39bc9ca" exitCode=0 Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.979885 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" 
event={"ID":"695367d2-8c58-4bee-adcc-61c6d1b7457b","Type":"ContainerDied","Data":"810749b3c3ce02665b129ee8d84b5fefda2c703314683991363a3f30a39bc9ca"} Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.983352 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" event={"ID":"221c6d11-7462-4578-bbb1-a78ee6bad7c0","Type":"ContainerStarted","Data":"cc54d308655542a0ddec060a01b614ea149c897981047fdde83598481e298a82"} Jan 27 00:34:16 crc kubenswrapper[4774]: I0127 00:34:16.079020 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:16 crc kubenswrapper[4774]: I0127 00:34:16.079395 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:16 crc kubenswrapper[4774]: I0127 00:34:16.079421 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:16 crc kubenswrapper[4774]: E0127 00:34:16.079240 4774 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Jan 27 00:34:16 crc kubenswrapper[4774]: I0127 00:34:16.079504 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:16 crc kubenswrapper[4774]: E0127 00:34:16.079553 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-default-cloud1-ceil-meter-proxy-tls podName:d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6 nodeName:}" failed. No retries permitted until 2026-01-27 00:34:16.579522845 +0000 UTC m=+1634.885299729 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" (UID: "d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6") : secret "default-cloud1-ceil-meter-proxy-tls" not found Jan 27 00:34:16 crc kubenswrapper[4774]: I0127 00:34:16.079589 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2rpf\" (UniqueName: \"kubernetes.io/projected/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-kube-api-access-l2rpf\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:16 crc kubenswrapper[4774]: I0127 00:34:16.080177 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:16 crc kubenswrapper[4774]: I0127 00:34:16.080267 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:16 crc kubenswrapper[4774]: I0127 00:34:16.088033 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:16 crc kubenswrapper[4774]: I0127 00:34:16.108341 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2rpf\" (UniqueName: \"kubernetes.io/projected/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-kube-api-access-l2rpf\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:16 crc kubenswrapper[4774]: I0127 00:34:16.588110 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:16 crc kubenswrapper[4774]: E0127 00:34:16.588463 4774 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Jan 27 00:34:16 crc kubenswrapper[4774]: E0127 00:34:16.588623 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-default-cloud1-ceil-meter-proxy-tls 
podName:d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6 nodeName:}" failed. No retries permitted until 2026-01-27 00:34:17.588596429 +0000 UTC m=+1635.894373313 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" (UID: "d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6") : secret "default-cloud1-ceil-meter-proxy-tls" not found Jan 27 00:34:17 crc kubenswrapper[4774]: I0127 00:34:17.605111 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:17 crc kubenswrapper[4774]: I0127 00:34:17.618411 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:17 crc kubenswrapper[4774]: I0127 00:34:17.739907 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.357391 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:34:19 crc kubenswrapper[4774]: E0127 00:34:19.357937 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.773880 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g"] Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.775989 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.778423 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls" Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.779231 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap" Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.787826 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g"] Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.954000 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/cce880cf-c820-49a2-9cf3-c2161499d51f-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.954054 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkvrs\" (UniqueName: \"kubernetes.io/projected/cce880cf-c820-49a2-9cf3-c2161499d51f-kube-api-access-wkvrs\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.954115 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.954242 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.954427 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/cce880cf-c820-49a2-9cf3-c2161499d51f-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc kubenswrapper[4774]: I0127 00:34:20.055592 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/cce880cf-c820-49a2-9cf3-c2161499d51f-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc 
kubenswrapper[4774]: I0127 00:34:20.055679 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/cce880cf-c820-49a2-9cf3-c2161499d51f-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc kubenswrapper[4774]: I0127 00:34:20.055699 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkvrs\" (UniqueName: \"kubernetes.io/projected/cce880cf-c820-49a2-9cf3-c2161499d51f-kube-api-access-wkvrs\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc kubenswrapper[4774]: I0127 00:34:20.055751 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc kubenswrapper[4774]: I0127 00:34:20.055779 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc kubenswrapper[4774]: E0127 00:34:20.056182 4774 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Jan 27 00:34:20 crc kubenswrapper[4774]: I0127 00:34:20.056276 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/cce880cf-c820-49a2-9cf3-c2161499d51f-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc kubenswrapper[4774]: E0127 00:34:20.056313 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-default-cloud1-sens-meter-proxy-tls podName:cce880cf-c820-49a2-9cf3-c2161499d51f nodeName:}" failed. No retries permitted until 2026-01-27 00:34:20.556267677 +0000 UTC m=+1638.862044561 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" (UID: "cce880cf-c820-49a2-9cf3-c2161499d51f") : secret "default-cloud1-sens-meter-proxy-tls" not found Jan 27 00:34:20 crc kubenswrapper[4774]: I0127 00:34:20.056462 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/cce880cf-c820-49a2-9cf3-c2161499d51f-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc kubenswrapper[4774]: I0127 00:34:20.064605 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc kubenswrapper[4774]: I0127 00:34:20.074335 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkvrs\" (UniqueName: \"kubernetes.io/projected/cce880cf-c820-49a2-9cf3-c2161499d51f-kube-api-access-wkvrs\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc kubenswrapper[4774]: I0127 00:34:20.562949 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc kubenswrapper[4774]: E0127 00:34:20.563172 4774 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Jan 27 00:34:20 crc kubenswrapper[4774]: E0127 00:34:20.563262 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-default-cloud1-sens-meter-proxy-tls podName:cce880cf-c820-49a2-9cf3-c2161499d51f nodeName:}" failed. No retries permitted until 2026-01-27 00:34:21.563243385 +0000 UTC m=+1639.869020269 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" (UID: "cce880cf-c820-49a2-9cf3-c2161499d51f") : secret "default-cloud1-sens-meter-proxy-tls" not found Jan 27 00:34:21 crc kubenswrapper[4774]: I0127 00:34:21.623164 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:21 crc kubenswrapper[4774]: I0127 00:34:21.629720 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:21 crc kubenswrapper[4774]: I0127 00:34:21.900166 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:23 crc kubenswrapper[4774]: I0127 00:34:23.852097 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g"] Jan 27 00:34:23 crc kubenswrapper[4774]: I0127 00:34:23.858584 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9"] Jan 27 00:34:23 crc kubenswrapper[4774]: W0127 00:34:23.951814 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd82a9cf9_0ba7_416d_93b4_b3bd9bc6a9f6.slice/crio-0dd122556523b6cbccabccbebb66c5d2524883b8236dc80d8eef8d82384996a1 WatchSource:0}: Error finding container 0dd122556523b6cbccabccbebb66c5d2524883b8236dc80d8eef8d82384996a1: Status 404 returned error can't find the container with id 0dd122556523b6cbccabccbebb66c5d2524883b8236dc80d8eef8d82384996a1 Jan 27 00:34:23 crc kubenswrapper[4774]: W0127 00:34:23.954102 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcce880cf_c820_49a2_9cf3_c2161499d51f.slice/crio-335c6ca030d87267671be761741da47811afd04cca8b97b22ed6048bc2527041 WatchSource:0}: Error finding container 335c6ca030d87267671be761741da47811afd04cca8b97b22ed6048bc2527041: Status 404 returned error can't find the container with id 335c6ca030d87267671be761741da47811afd04cca8b97b22ed6048bc2527041 Jan 27 00:34:24 crc kubenswrapper[4774]: I0127 00:34:24.053046 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" event={"ID":"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6","Type":"ContainerStarted","Data":"0dd122556523b6cbccabccbebb66c5d2524883b8236dc80d8eef8d82384996a1"} Jan 27 00:34:24 crc kubenswrapper[4774]: I0127 00:34:24.058902 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" 
event={"ID":"695367d2-8c58-4bee-adcc-61c6d1b7457b","Type":"ContainerStarted","Data":"68ad807559137d36959c1aa7090218a544b931c8a28f63f731d68957948ac5c5"} Jan 27 00:34:24 crc kubenswrapper[4774]: I0127 00:34:24.060247 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" event={"ID":"cce880cf-c820-49a2-9cf3-c2161499d51f","Type":"ContainerStarted","Data":"335c6ca030d87267671be761741da47811afd04cca8b97b22ed6048bc2527041"} Jan 27 00:34:24 crc kubenswrapper[4774]: I0127 00:34:24.062449 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" event={"ID":"221c6d11-7462-4578-bbb1-a78ee6bad7c0","Type":"ContainerStarted","Data":"2acf1a5b21522993031d7a31e0b33469eb0fa0336dd2cd0bf363d142d45721ee"} Jan 27 00:34:24 crc kubenswrapper[4774]: I0127 00:34:24.065182 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"71259922-2e93-4571-80c6-e054f4372056","Type":"ContainerStarted","Data":"fa6b17978110c0c566da456324f7019ceb843ea7b8ee0b84d9507e78d46fce1b"} Jan 27 00:34:24 crc kubenswrapper[4774]: I0127 00:34:24.093628 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=5.013662005 podStartE2EDuration="40.093599956s" podCreationTimestamp="2026-01-27 00:33:44 +0000 UTC" firstStartedPulling="2026-01-27 00:33:48.285173991 +0000 UTC m=+1606.590950875" lastFinishedPulling="2026-01-27 00:34:23.365111942 +0000 UTC m=+1641.670888826" observedRunningTime="2026-01-27 00:34:24.089139716 +0000 UTC m=+1642.394916620" watchObservedRunningTime="2026-01-27 00:34:24.093599956 +0000 UTC m=+1642.399376840" Jan 27 00:34:25 crc kubenswrapper[4774]: I0127 00:34:25.101605 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" event={"ID":"cce880cf-c820-49a2-9cf3-c2161499d51f","Type":"ContainerStarted","Data":"e86719840d53227a371646ced201bfdda0d466ae22ffba039461546e413783bc"} Jan 27 00:34:25 crc kubenswrapper[4774]: I0127 00:34:25.106006 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" event={"ID":"221c6d11-7462-4578-bbb1-a78ee6bad7c0","Type":"ContainerStarted","Data":"a7d5fa1cc4b7c3203e38d581c1fee7da028e28e96cebdb25bcb5dca2d68d5a4f"} Jan 27 00:34:26 crc kubenswrapper[4774]: I0127 00:34:26.118736 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"695367d2-8c58-4bee-adcc-61c6d1b7457b","Type":"ContainerStarted","Data":"1d5cacd63e8a2f8cf24e618cade0bc5f4b296dad05817cd41c06295a2aba6d65"} Jan 27 00:34:26 crc kubenswrapper[4774]: I0127 00:34:26.122260 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" event={"ID":"cce880cf-c820-49a2-9cf3-c2161499d51f","Type":"ContainerStarted","Data":"2c11176ba606ee0424f5808b875366de2c6dcfe3113b88c461b978c03af1bf15"} Jan 27 00:34:26 crc kubenswrapper[4774]: I0127 00:34:26.125637 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" event={"ID":"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6","Type":"ContainerStarted","Data":"4f37ba1566a4b796836eee776bea95eb9e436293729ba46e3dcd058ce82cd3bb"} Jan 27 00:34:26 crc kubenswrapper[4774]: I0127 
00:34:26.125699 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" event={"ID":"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6","Type":"ContainerStarted","Data":"79fc62c1835dfc91d81c5d2ece1e2569c8514a410294d2bda05be96d3ba44d6f"} Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.652302 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8"] Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.653758 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.656973 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.677056 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.678301 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8"] Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.742151 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7jpz\" (UniqueName: \"kubernetes.io/projected/40a1ab63-3b75-4f20-a692-58d80ccd2847-kube-api-access-h7jpz\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.742228 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/40a1ab63-3b75-4f20-a692-58d80ccd2847-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.742262 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/40a1ab63-3b75-4f20-a692-58d80ccd2847-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.742374 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/40a1ab63-3b75-4f20-a692-58d80ccd2847-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.844263 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/40a1ab63-3b75-4f20-a692-58d80ccd2847-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " 
pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.844330 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/40a1ab63-3b75-4f20-a692-58d80ccd2847-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.844422 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/40a1ab63-3b75-4f20-a692-58d80ccd2847-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.844483 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7jpz\" (UniqueName: \"kubernetes.io/projected/40a1ab63-3b75-4f20-a692-58d80ccd2847-kube-api-access-h7jpz\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.845429 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/40a1ab63-3b75-4f20-a692-58d80ccd2847-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.845760 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/40a1ab63-3b75-4f20-a692-58d80ccd2847-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.853507 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/40a1ab63-3b75-4f20-a692-58d80ccd2847-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.867888 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7jpz\" (UniqueName: \"kubernetes.io/projected/40a1ab63-3b75-4f20-a692-58d80ccd2847-kube-api-access-h7jpz\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.955660 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.978085 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.585624 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp"] Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.586910 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.590327 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.609028 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp"] Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.655740 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvd77\" (UniqueName: \"kubernetes.io/projected/78320a5f-ee7d-4190-8af3-9d9609bcf111-kube-api-access-rvd77\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.655878 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/78320a5f-ee7d-4190-8af3-9d9609bcf111-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.655917 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/78320a5f-ee7d-4190-8af3-9d9609bcf111-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.656015 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/78320a5f-ee7d-4190-8af3-9d9609bcf111-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.758751 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvd77\" (UniqueName: \"kubernetes.io/projected/78320a5f-ee7d-4190-8af3-9d9609bcf111-kube-api-access-rvd77\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.758895 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/78320a5f-ee7d-4190-8af3-9d9609bcf111-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: 
\"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.758944 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/78320a5f-ee7d-4190-8af3-9d9609bcf111-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.759008 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/78320a5f-ee7d-4190-8af3-9d9609bcf111-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.760157 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/78320a5f-ee7d-4190-8af3-9d9609bcf111-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.760539 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/78320a5f-ee7d-4190-8af3-9d9609bcf111-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.782105 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/78320a5f-ee7d-4190-8af3-9d9609bcf111-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.786894 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvd77\" (UniqueName: \"kubernetes.io/projected/78320a5f-ee7d-4190-8af3-9d9609bcf111-kube-api-access-rvd77\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.909212 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:29 crc kubenswrapper[4774]: I0127 00:34:29.733927 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8"] Jan 27 00:34:29 crc kubenswrapper[4774]: W0127 00:34:29.736896 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40a1ab63_3b75_4f20_a692_58d80ccd2847.slice/crio-35c64f467eba82058b15d5181ef6d1a8e2bdce8df5cf4eacf8f177bfd0c6b3f0 WatchSource:0}: Error finding container 35c64f467eba82058b15d5181ef6d1a8e2bdce8df5cf4eacf8f177bfd0c6b3f0: Status 404 returned error can't find the container with id 35c64f467eba82058b15d5181ef6d1a8e2bdce8df5cf4eacf8f177bfd0c6b3f0 Jan 27 00:34:29 crc kubenswrapper[4774]: I0127 00:34:29.797115 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp"] Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.161350 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" event={"ID":"40a1ab63-3b75-4f20-a692-58d80ccd2847","Type":"ContainerStarted","Data":"58f83e6debebc5843c00cb7eeae5c528ab9844c886afeaef42602c15951a040c"} Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.161403 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" event={"ID":"40a1ab63-3b75-4f20-a692-58d80ccd2847","Type":"ContainerStarted","Data":"35c64f467eba82058b15d5181ef6d1a8e2bdce8df5cf4eacf8f177bfd0c6b3f0"} Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.163953 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" event={"ID":"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6","Type":"ContainerStarted","Data":"723dcc4f289a2404c75f71d52ad8aba594be567de1c9fa4614b1de0a72fb3740"} Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.174130 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"695367d2-8c58-4bee-adcc-61c6d1b7457b","Type":"ContainerStarted","Data":"c960e0c21bcfc444a274889c5d7a1571cb17081511a03b672c22fd296c6b3d7e"} Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.190829 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" event={"ID":"cce880cf-c820-49a2-9cf3-c2161499d51f","Type":"ContainerStarted","Data":"4c760483fbd2a1a7acf7ac960a78224bab5e8dc9e4bb33d2f80641b2bd1befaa"} Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.193083 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" podStartSLOduration=10.046081442 podStartE2EDuration="15.193052716s" podCreationTimestamp="2026-01-27 00:34:15 +0000 UTC" firstStartedPulling="2026-01-27 00:34:24.312625435 +0000 UTC m=+1642.618402319" lastFinishedPulling="2026-01-27 00:34:29.459596699 +0000 UTC m=+1647.765373593" observedRunningTime="2026-01-27 00:34:30.184996139 +0000 UTC m=+1648.490773043" watchObservedRunningTime="2026-01-27 00:34:30.193052716 +0000 UTC m=+1648.498829600" Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.219188 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" event={"ID":"221c6d11-7462-4578-bbb1-a78ee6bad7c0","Type":"ContainerStarted","Data":"0682319276b5bd880656870f2652514e216a14cbef4bb9526e80e68c3ca94bad"} Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.230197 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" event={"ID":"78320a5f-ee7d-4190-8af3-9d9609bcf111","Type":"ContainerStarted","Data":"74df0230fe8ea32d4f70de22858c8d9d25c4639a075ffac3c84a13913df6185f"} Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.230255 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" event={"ID":"78320a5f-ee7d-4190-8af3-9d9609bcf111","Type":"ContainerStarted","Data":"02c4ce06168db4f170d105ebb8fa3c2ff2b4beb60ee2ef582f3a1e8457ac7cca"} Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.234677 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=18.705891366 podStartE2EDuration="32.23458495s" podCreationTimestamp="2026-01-27 00:33:58 +0000 UTC" firstStartedPulling="2026-01-27 00:34:15.982630203 +0000 UTC m=+1634.288407087" lastFinishedPulling="2026-01-27 00:34:29.511323787 +0000 UTC m=+1647.817100671" observedRunningTime="2026-01-27 00:34:30.218365845 +0000 UTC m=+1648.524142729" watchObservedRunningTime="2026-01-27 00:34:30.23458495 +0000 UTC m=+1648.540361834" Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.269199 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" podStartSLOduration=6.071436872 podStartE2EDuration="11.269168219s" podCreationTimestamp="2026-01-27 00:34:19 +0000 UTC" firstStartedPulling="2026-01-27 00:34:24.31208847 +0000 UTC m=+1642.617865364" lastFinishedPulling="2026-01-27 00:34:29.509819817 +0000 UTC m=+1647.815596711" observedRunningTime="2026-01-27 00:34:30.247128577 +0000 UTC m=+1648.552905451" watchObservedRunningTime="2026-01-27 00:34:30.269168219 +0000 UTC m=+1648.574945103" Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.269755 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" podStartSLOduration=3.253443765 podStartE2EDuration="17.269746735s" podCreationTimestamp="2026-01-27 00:34:13 +0000 UTC" firstStartedPulling="2026-01-27 00:34:15.491542403 +0000 UTC m=+1633.797319277" lastFinishedPulling="2026-01-27 00:34:29.507845363 +0000 UTC m=+1647.813622247" observedRunningTime="2026-01-27 00:34:30.267617637 +0000 UTC m=+1648.573394541" watchObservedRunningTime="2026-01-27 00:34:30.269746735 +0000 UTC m=+1648.575523609" Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.361827 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:34:30 crc kubenswrapper[4774]: E0127 00:34:30.363374 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 
00:34:31 crc kubenswrapper[4774]: I0127 00:34:31.238844 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" event={"ID":"40a1ab63-3b75-4f20-a692-58d80ccd2847","Type":"ContainerStarted","Data":"3ecdf66505bc29c0f15176c86ae97d1090bd079768c2c1e5209c055cae244ec2"} Jan 27 00:34:31 crc kubenswrapper[4774]: I0127 00:34:31.242021 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" event={"ID":"78320a5f-ee7d-4190-8af3-9d9609bcf111","Type":"ContainerStarted","Data":"8c54909466989ae687c06c9a207a5ec93894db780381f2df64a8096664a167cf"} Jan 27 00:34:31 crc kubenswrapper[4774]: I0127 00:34:31.289268 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" podStartSLOduration=2.943343535 podStartE2EDuration="3.28924346s" podCreationTimestamp="2026-01-27 00:34:28 +0000 UTC" firstStartedPulling="2026-01-27 00:34:29.827439672 +0000 UTC m=+1648.133216566" lastFinishedPulling="2026-01-27 00:34:30.173339587 +0000 UTC m=+1648.479116491" observedRunningTime="2026-01-27 00:34:31.287688268 +0000 UTC m=+1649.593465152" watchObservedRunningTime="2026-01-27 00:34:31.28924346 +0000 UTC m=+1649.595020344" Jan 27 00:34:31 crc kubenswrapper[4774]: I0127 00:34:31.294308 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" podStartSLOduration=3.8625214850000003 podStartE2EDuration="4.294297355s" podCreationTimestamp="2026-01-27 00:34:27 +0000 UTC" firstStartedPulling="2026-01-27 00:34:29.744517296 +0000 UTC m=+1648.050294180" lastFinishedPulling="2026-01-27 00:34:30.176293156 +0000 UTC m=+1648.482070050" observedRunningTime="2026-01-27 00:34:31.26320003 +0000 UTC m=+1649.568976914" watchObservedRunningTime="2026-01-27 00:34:31.294297355 +0000 UTC m=+1649.600074239" Jan 27 00:34:32 crc kubenswrapper[4774]: I0127 00:34:32.955876 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0" Jan 27 00:34:33 crc kubenswrapper[4774]: I0127 00:34:33.017996 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0" Jan 27 00:34:33 crc kubenswrapper[4774]: I0127 00:34:33.320647 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0" Jan 27 00:34:41 crc kubenswrapper[4774]: I0127 00:34:41.485975 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-b5bgt"] Jan 27 00:34:41 crc kubenswrapper[4774]: I0127 00:34:41.486912 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" podUID="2c8f3edd-38d2-4a50-add2-873dd1ac35e5" containerName="default-interconnect" containerID="cri-o://ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f" gracePeriod=30 Jan 27 00:34:41 crc kubenswrapper[4774]: I0127 00:34:41.890611 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.004588 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-credentials\") pod \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.005188 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-ca\") pod \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.005265 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdh7w\" (UniqueName: \"kubernetes.io/projected/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-kube-api-access-qdh7w\") pod \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.005348 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-ca\") pod \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.005383 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-config\") pod \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.005415 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-credentials\") pod \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.005439 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-users\") pod \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.007003 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "2c8f3edd-38d2-4a50-add2-873dd1ac35e5" (UID: "2c8f3edd-38d2-4a50-add2-873dd1ac35e5"). InnerVolumeSpecName "sasl-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.012803 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "2c8f3edd-38d2-4a50-add2-873dd1ac35e5" (UID: "2c8f3edd-38d2-4a50-add2-873dd1ac35e5"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.013242 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-kube-api-access-qdh7w" (OuterVolumeSpecName: "kube-api-access-qdh7w") pod "2c8f3edd-38d2-4a50-add2-873dd1ac35e5" (UID: "2c8f3edd-38d2-4a50-add2-873dd1ac35e5"). InnerVolumeSpecName "kube-api-access-qdh7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.013372 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "2c8f3edd-38d2-4a50-add2-873dd1ac35e5" (UID: "2c8f3edd-38d2-4a50-add2-873dd1ac35e5"). InnerVolumeSpecName "default-interconnect-openstack-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.014063 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "2c8f3edd-38d2-4a50-add2-873dd1ac35e5" (UID: "2c8f3edd-38d2-4a50-add2-873dd1ac35e5"). InnerVolumeSpecName "sasl-users". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.014979 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "2c8f3edd-38d2-4a50-add2-873dd1ac35e5" (UID: "2c8f3edd-38d2-4a50-add2-873dd1ac35e5"). InnerVolumeSpecName "default-interconnect-inter-router-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.029121 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "2c8f3edd-38d2-4a50-add2-873dd1ac35e5" (UID: "2c8f3edd-38d2-4a50-add2-873dd1ac35e5"). InnerVolumeSpecName "default-interconnect-openstack-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.108949 4774 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.109021 4774 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.109050 4774 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-credentials\") on node \"crc\" DevicePath \"\"" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.109074 4774 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-users\") on node \"crc\" DevicePath \"\"" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.109094 4774 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\"" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.109114 4774 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.109137 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdh7w\" (UniqueName: \"kubernetes.io/projected/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-kube-api-access-qdh7w\") on node \"crc\" DevicePath \"\"" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.348619 4774 generic.go:334] "Generic (PLEG): container finished" podID="d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6" containerID="4f37ba1566a4b796836eee776bea95eb9e436293729ba46e3dcd058ce82cd3bb" exitCode=0 Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.349205 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" event={"ID":"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6","Type":"ContainerDied","Data":"4f37ba1566a4b796836eee776bea95eb9e436293729ba46e3dcd058ce82cd3bb"} Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.350763 4774 generic.go:334] "Generic (PLEG): container finished" podID="2c8f3edd-38d2-4a50-add2-873dd1ac35e5" containerID="ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f" exitCode=0 Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.350839 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.350828 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" event={"ID":"2c8f3edd-38d2-4a50-add2-873dd1ac35e5","Type":"ContainerDied","Data":"ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f"} Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.350979 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" event={"ID":"2c8f3edd-38d2-4a50-add2-873dd1ac35e5","Type":"ContainerDied","Data":"a5ba257ff6bd2d15abb20ec26902d31e3cb2085f1429dfba58273e45e14cd88c"} Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.351013 4774 scope.go:117] "RemoveContainer" containerID="ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.353990 4774 scope.go:117] "RemoveContainer" containerID="4f37ba1566a4b796836eee776bea95eb9e436293729ba46e3dcd058ce82cd3bb" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.354060 4774 generic.go:334] "Generic (PLEG): container finished" podID="221c6d11-7462-4578-bbb1-a78ee6bad7c0" containerID="a7d5fa1cc4b7c3203e38d581c1fee7da028e28e96cebdb25bcb5dca2d68d5a4f" exitCode=0 Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.354129 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" event={"ID":"221c6d11-7462-4578-bbb1-a78ee6bad7c0","Type":"ContainerDied","Data":"a7d5fa1cc4b7c3203e38d581c1fee7da028e28e96cebdb25bcb5dca2d68d5a4f"} Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.355025 4774 scope.go:117] "RemoveContainer" containerID="a7d5fa1cc4b7c3203e38d581c1fee7da028e28e96cebdb25bcb5dca2d68d5a4f" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.361493 4774 generic.go:334] "Generic (PLEG): container finished" podID="78320a5f-ee7d-4190-8af3-9d9609bcf111" containerID="74df0230fe8ea32d4f70de22858c8d9d25c4639a075ffac3c84a13913df6185f" exitCode=0 Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.373665 4774 generic.go:334] "Generic (PLEG): container finished" podID="40a1ab63-3b75-4f20-a692-58d80ccd2847" containerID="58f83e6debebc5843c00cb7eeae5c528ab9844c886afeaef42602c15951a040c" exitCode=0 Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.384244 4774 scope.go:117] "RemoveContainer" containerID="ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f" Jan 27 00:34:42 crc kubenswrapper[4774]: E0127 00:34:42.390423 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f\": container with ID starting with ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f not found: ID does not exist" containerID="ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.390494 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f"} err="failed to get container status \"ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f\": rpc error: code = NotFound desc = could not find container \"ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f\": container with ID starting with 
ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f not found: ID does not exist" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.415261 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" event={"ID":"78320a5f-ee7d-4190-8af3-9d9609bcf111","Type":"ContainerDied","Data":"74df0230fe8ea32d4f70de22858c8d9d25c4639a075ffac3c84a13913df6185f"} Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.415346 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" event={"ID":"40a1ab63-3b75-4f20-a692-58d80ccd2847","Type":"ContainerDied","Data":"58f83e6debebc5843c00cb7eeae5c528ab9844c886afeaef42602c15951a040c"} Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.416147 4774 scope.go:117] "RemoveContainer" containerID="58f83e6debebc5843c00cb7eeae5c528ab9844c886afeaef42602c15951a040c" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.423197 4774 scope.go:117] "RemoveContainer" containerID="74df0230fe8ea32d4f70de22858c8d9d25c4639a075ffac3c84a13913df6185f" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.493028 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-b5bgt"] Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.501134 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-b5bgt"] Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.086413 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-rjwl8"] Jan 27 00:34:43 crc kubenswrapper[4774]: E0127 00:34:43.087814 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8f3edd-38d2-4a50-add2-873dd1ac35e5" containerName="default-interconnect" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.087952 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8f3edd-38d2-4a50-add2-873dd1ac35e5" containerName="default-interconnect" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.088218 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c8f3edd-38d2-4a50-add2-873dd1ac35e5" containerName="default-interconnect" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.088991 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.091851 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.092269 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.092468 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.092600 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.092723 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-cdfh6" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.093433 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.093822 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.110529 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-rjwl8"] Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.148265 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dq2z\" (UniqueName: \"kubernetes.io/projected/2480c4b1-c406-412e-9258-a94358f2c1c1-kube-api-access-6dq2z\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.148342 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.148374 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-sasl-users\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.148417 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.148438 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.148461 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2480c4b1-c406-412e-9258-a94358f2c1c1-sasl-config\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.148487 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.249483 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.249550 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-sasl-users\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.249589 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.249614 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.249639 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2480c4b1-c406-412e-9258-a94358f2c1c1-sasl-config\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.249671 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.249708 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dq2z\" (UniqueName: \"kubernetes.io/projected/2480c4b1-c406-412e-9258-a94358f2c1c1-kube-api-access-6dq2z\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.251894 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2480c4b1-c406-412e-9258-a94358f2c1c1-sasl-config\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.263625 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-sasl-users\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.264832 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.264879 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.264891 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.268760 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.270581 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6dq2z\" (UniqueName: \"kubernetes.io/projected/2480c4b1-c406-412e-9258-a94358f2c1c1-kube-api-access-6dq2z\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.384601 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" event={"ID":"221c6d11-7462-4578-bbb1-a78ee6bad7c0","Type":"ContainerStarted","Data":"d31f6718b5972460b7dfbd88cdf2e00be97bc26c0bfac2637e829c88a7995cf9"} Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.387774 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" event={"ID":"78320a5f-ee7d-4190-8af3-9d9609bcf111","Type":"ContainerStarted","Data":"3c779b9ff0e29a6c4656f9b97f44ee83dfa3b7b8e130cc42bd59a597e9ce8b4c"} Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.390556 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" event={"ID":"40a1ab63-3b75-4f20-a692-58d80ccd2847","Type":"ContainerStarted","Data":"de57d00be6247b9776d90dd256bd52c864446df8475238e51b4cc4b0797c6c30"} Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.393931 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" event={"ID":"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6","Type":"ContainerStarted","Data":"b9898876635ec2d4af58cf5374041301bf103fd58c245f17fb5c65e927758ecf"} Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.397127 4774 generic.go:334] "Generic (PLEG): container finished" podID="cce880cf-c820-49a2-9cf3-c2161499d51f" containerID="2c11176ba606ee0424f5808b875366de2c6dcfe3113b88c461b978c03af1bf15" exitCode=0 Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.397224 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" event={"ID":"cce880cf-c820-49a2-9cf3-c2161499d51f","Type":"ContainerDied","Data":"2c11176ba606ee0424f5808b875366de2c6dcfe3113b88c461b978c03af1bf15"} Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.399400 4774 scope.go:117] "RemoveContainer" containerID="2c11176ba606ee0424f5808b875366de2c6dcfe3113b88c461b978c03af1bf15" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.406430 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.782896 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-rjwl8"] Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.033925 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"] Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.034904 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.038429 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.038544 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.043282 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.171341 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/fe039606-88c5-414d-b20c-a129a4bb0782-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"fe039606-88c5-414d-b20c-a129a4bb0782\") " pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.171400 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/fe039606-88c5-414d-b20c-a129a4bb0782-qdr-test-config\") pod \"qdr-test\" (UID: \"fe039606-88c5-414d-b20c-a129a4bb0782\") " pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.171423 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkm2p\" (UniqueName: \"kubernetes.io/projected/fe039606-88c5-414d-b20c-a129a4bb0782-kube-api-access-qkm2p\") pod \"qdr-test\" (UID: \"fe039606-88c5-414d-b20c-a129a4bb0782\") " pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.272657 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/fe039606-88c5-414d-b20c-a129a4bb0782-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"fe039606-88c5-414d-b20c-a129a4bb0782\") " pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.272749 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkm2p\" (UniqueName: \"kubernetes.io/projected/fe039606-88c5-414d-b20c-a129a4bb0782-kube-api-access-qkm2p\") pod \"qdr-test\" (UID: \"fe039606-88c5-414d-b20c-a129a4bb0782\") " pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.272773 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/fe039606-88c5-414d-b20c-a129a4bb0782-qdr-test-config\") pod \"qdr-test\" (UID: \"fe039606-88c5-414d-b20c-a129a4bb0782\") " pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.273737 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/fe039606-88c5-414d-b20c-a129a4bb0782-qdr-test-config\") pod \"qdr-test\" (UID: \"fe039606-88c5-414d-b20c-a129a4bb0782\") " pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.280418 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/fe039606-88c5-414d-b20c-a129a4bb0782-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: 
\"fe039606-88c5-414d-b20c-a129a4bb0782\") " pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.293526 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkm2p\" (UniqueName: \"kubernetes.io/projected/fe039606-88c5-414d-b20c-a129a4bb0782-kube-api-access-qkm2p\") pod \"qdr-test\" (UID: \"fe039606-88c5-414d-b20c-a129a4bb0782\") " pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.366491 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c8f3edd-38d2-4a50-add2-873dd1ac35e5" path="/var/lib/kubelet/pods/2c8f3edd-38d2-4a50-add2-873dd1ac35e5/volumes" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.368658 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.415161 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" event={"ID":"2480c4b1-c406-412e-9258-a94358f2c1c1","Type":"ContainerStarted","Data":"19d79350a136be565c6857d0a87082d9b9549bc42dffec45f34fa0037b1f5da4"} Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.415219 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" event={"ID":"2480c4b1-c406-412e-9258-a94358f2c1c1","Type":"ContainerStarted","Data":"1ca32d8536934458b3214a3be428cba7ebec47814fa35d583e204f1dd4a4955e"} Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.421410 4774 generic.go:334] "Generic (PLEG): container finished" podID="d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6" containerID="b9898876635ec2d4af58cf5374041301bf103fd58c245f17fb5c65e927758ecf" exitCode=0 Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.421482 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" event={"ID":"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6","Type":"ContainerDied","Data":"b9898876635ec2d4af58cf5374041301bf103fd58c245f17fb5c65e927758ecf"} Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.421524 4774 scope.go:117] "RemoveContainer" containerID="4f37ba1566a4b796836eee776bea95eb9e436293729ba46e3dcd058ce82cd3bb" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.422085 4774 scope.go:117] "RemoveContainer" containerID="b9898876635ec2d4af58cf5374041301bf103fd58c245f17fb5c65e927758ecf" Jan 27 00:34:44 crc kubenswrapper[4774]: E0127 00:34:44.422280 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9_service-telemetry(d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" podUID="d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.426911 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" event={"ID":"cce880cf-c820-49a2-9cf3-c2161499d51f","Type":"ContainerStarted","Data":"eaa793c6854b8af24202f25f79d08e139aab8cabda8244e19d2dec680342c745"} Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.443697 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" podStartSLOduration=3.443673326 
podStartE2EDuration="3.443673326s" podCreationTimestamp="2026-01-27 00:34:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:34:44.436353339 +0000 UTC m=+1662.742130223" watchObservedRunningTime="2026-01-27 00:34:44.443673326 +0000 UTC m=+1662.749450210" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.446012 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" event={"ID":"221c6d11-7462-4578-bbb1-a78ee6bad7c0","Type":"ContainerDied","Data":"d31f6718b5972460b7dfbd88cdf2e00be97bc26c0bfac2637e829c88a7995cf9"} Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.446026 4774 generic.go:334] "Generic (PLEG): container finished" podID="221c6d11-7462-4578-bbb1-a78ee6bad7c0" containerID="d31f6718b5972460b7dfbd88cdf2e00be97bc26c0bfac2637e829c88a7995cf9" exitCode=0 Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.446780 4774 scope.go:117] "RemoveContainer" containerID="d31f6718b5972460b7dfbd88cdf2e00be97bc26c0bfac2637e829c88a7995cf9" Jan 27 00:34:44 crc kubenswrapper[4774]: E0127 00:34:44.447166 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr_service-telemetry(221c6d11-7462-4578-bbb1-a78ee6bad7c0)\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" podUID="221c6d11-7462-4578-bbb1-a78ee6bad7c0" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.481441 4774 generic.go:334] "Generic (PLEG): container finished" podID="78320a5f-ee7d-4190-8af3-9d9609bcf111" containerID="3c779b9ff0e29a6c4656f9b97f44ee83dfa3b7b8e130cc42bd59a597e9ce8b4c" exitCode=0 Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.481484 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" event={"ID":"78320a5f-ee7d-4190-8af3-9d9609bcf111","Type":"ContainerDied","Data":"3c779b9ff0e29a6c4656f9b97f44ee83dfa3b7b8e130cc42bd59a597e9ce8b4c"} Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.484011 4774 scope.go:117] "RemoveContainer" containerID="3c779b9ff0e29a6c4656f9b97f44ee83dfa3b7b8e130cc42bd59a597e9ce8b4c" Jan 27 00:34:44 crc kubenswrapper[4774]: E0127 00:34:44.484258 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp_service-telemetry(78320a5f-ee7d-4190-8af3-9d9609bcf111)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" podUID="78320a5f-ee7d-4190-8af3-9d9609bcf111" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.496329 4774 scope.go:117] "RemoveContainer" containerID="a7d5fa1cc4b7c3203e38d581c1fee7da028e28e96cebdb25bcb5dca2d68d5a4f" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.504352 4774 generic.go:334] "Generic (PLEG): container finished" podID="40a1ab63-3b75-4f20-a692-58d80ccd2847" containerID="de57d00be6247b9776d90dd256bd52c864446df8475238e51b4cc4b0797c6c30" exitCode=0 Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.504415 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" 
event={"ID":"40a1ab63-3b75-4f20-a692-58d80ccd2847","Type":"ContainerDied","Data":"de57d00be6247b9776d90dd256bd52c864446df8475238e51b4cc4b0797c6c30"} Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.505134 4774 scope.go:117] "RemoveContainer" containerID="de57d00be6247b9776d90dd256bd52c864446df8475238e51b4cc4b0797c6c30" Jan 27 00:34:44 crc kubenswrapper[4774]: E0127 00:34:44.505386 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-564958f777-cn8q8_service-telemetry(40a1ab63-3b75-4f20-a692-58d80ccd2847)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" podUID="40a1ab63-3b75-4f20-a692-58d80ccd2847" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.538715 4774 scope.go:117] "RemoveContainer" containerID="74df0230fe8ea32d4f70de22858c8d9d25c4639a075ffac3c84a13913df6185f" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.607028 4774 scope.go:117] "RemoveContainer" containerID="58f83e6debebc5843c00cb7eeae5c528ab9844c886afeaef42602c15951a040c" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.911967 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Jan 27 00:34:44 crc kubenswrapper[4774]: W0127 00:34:44.916136 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe039606_88c5_414d_b20c_a129a4bb0782.slice/crio-a4a8034afc6ceb9593b356b70741af03ed4076db1700b116b3f070c4e2dbd4cc WatchSource:0}: Error finding container a4a8034afc6ceb9593b356b70741af03ed4076db1700b116b3f070c4e2dbd4cc: Status 404 returned error can't find the container with id a4a8034afc6ceb9593b356b70741af03ed4076db1700b116b3f070c4e2dbd4cc Jan 27 00:34:45 crc kubenswrapper[4774]: I0127 00:34:45.356417 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:34:45 crc kubenswrapper[4774]: E0127 00:34:45.356883 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:34:45 crc kubenswrapper[4774]: I0127 00:34:45.535920 4774 generic.go:334] "Generic (PLEG): container finished" podID="cce880cf-c820-49a2-9cf3-c2161499d51f" containerID="eaa793c6854b8af24202f25f79d08e139aab8cabda8244e19d2dec680342c745" exitCode=0 Jan 27 00:34:45 crc kubenswrapper[4774]: I0127 00:34:45.536431 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" event={"ID":"cce880cf-c820-49a2-9cf3-c2161499d51f","Type":"ContainerDied","Data":"eaa793c6854b8af24202f25f79d08e139aab8cabda8244e19d2dec680342c745"} Jan 27 00:34:45 crc kubenswrapper[4774]: I0127 00:34:45.536484 4774 scope.go:117] "RemoveContainer" containerID="2c11176ba606ee0424f5808b875366de2c6dcfe3113b88c461b978c03af1bf15" Jan 27 00:34:45 crc kubenswrapper[4774]: I0127 00:34:45.537239 4774 scope.go:117] "RemoveContainer" containerID="eaa793c6854b8af24202f25f79d08e139aab8cabda8244e19d2dec680342c745" Jan 27 00:34:45 crc kubenswrapper[4774]: E0127 
00:34:45.537515 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g_service-telemetry(cce880cf-c820-49a2-9cf3-c2161499d51f)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" podUID="cce880cf-c820-49a2-9cf3-c2161499d51f" Jan 27 00:34:45 crc kubenswrapper[4774]: I0127 00:34:45.631769 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"fe039606-88c5-414d-b20c-a129a4bb0782","Type":"ContainerStarted","Data":"a4a8034afc6ceb9593b356b70741af03ed4076db1700b116b3f070c4e2dbd4cc"} Jan 27 00:34:55 crc kubenswrapper[4774]: I0127 00:34:55.358270 4774 scope.go:117] "RemoveContainer" containerID="b9898876635ec2d4af58cf5374041301bf103fd58c245f17fb5c65e927758ecf" Jan 27 00:34:56 crc kubenswrapper[4774]: I0127 00:34:56.248442 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 00:34:56 crc kubenswrapper[4774]: I0127 00:34:56.357021 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:34:56 crc kubenswrapper[4774]: E0127 00:34:56.357281 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:34:56 crc kubenswrapper[4774]: I0127 00:34:56.357731 4774 scope.go:117] "RemoveContainer" containerID="eaa793c6854b8af24202f25f79d08e139aab8cabda8244e19d2dec680342c745" Jan 27 00:34:56 crc kubenswrapper[4774]: I0127 00:34:56.763806 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" event={"ID":"cce880cf-c820-49a2-9cf3-c2161499d51f","Type":"ContainerStarted","Data":"f8f10b196b67872bb3419b950227b872761dd478b9fdeeb86f5c83fb84943e1b"} Jan 27 00:34:56 crc kubenswrapper[4774]: I0127 00:34:56.766640 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"fe039606-88c5-414d-b20c-a129a4bb0782","Type":"ContainerStarted","Data":"2e8f6a82588628633e3da4cb06eb019f4f21a9f6756ee608368c7172acae123e"} Jan 27 00:34:56 crc kubenswrapper[4774]: I0127 00:34:56.771078 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" event={"ID":"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6","Type":"ContainerStarted","Data":"c483661aab70268feeb0ad1f4b7f9fae9b4ed2ad388c4936ecacbdc77485c2fb"} Jan 27 00:34:56 crc kubenswrapper[4774]: I0127 00:34:56.810563 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=2.383996196 podStartE2EDuration="13.810526763s" podCreationTimestamp="2026-01-27 00:34:43 +0000 UTC" firstStartedPulling="2026-01-27 00:34:44.91919625 +0000 UTC m=+1663.224973134" lastFinishedPulling="2026-01-27 00:34:56.345726817 +0000 UTC m=+1674.651503701" observedRunningTime="2026-01-27 00:34:56.80928134 +0000 UTC m=+1675.115058224" watchObservedRunningTime="2026-01-27 00:34:56.810526763 +0000 UTC m=+1675.116303637" Jan 
27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.145305 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-mjtw7"] Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.146627 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.150494 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.150808 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.152643 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.153021 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.154097 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.155006 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.173667 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-mjtw7"] Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.240272 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-healthcheck-log\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.240348 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.240378 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-publisher\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.240477 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-config\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.240534 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" 
(UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.240566 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-sensubility-config\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.240602 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pjvq\" (UniqueName: \"kubernetes.io/projected/e41bad20-a43c-4561-aefa-15b6cdfb715b-kube-api-access-9pjvq\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.341641 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.342112 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-sensubility-config\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.342155 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pjvq\" (UniqueName: \"kubernetes.io/projected/e41bad20-a43c-4561-aefa-15b6cdfb715b-kube-api-access-9pjvq\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.342187 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-healthcheck-log\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.342760 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.343202 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 
crc kubenswrapper[4774]: I0127 00:34:57.343470 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-healthcheck-log\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.344000 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.342794 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-publisher\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.344112 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-publisher\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.344159 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-config\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.344634 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-sensubility-config\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.344959 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-config\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.357190 4774 scope.go:117] "RemoveContainer" containerID="3c779b9ff0e29a6c4656f9b97f44ee83dfa3b7b8e130cc42bd59a597e9ce8b4c" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.379056 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pjvq\" (UniqueName: \"kubernetes.io/projected/e41bad20-a43c-4561-aefa-15b6cdfb715b-kube-api-access-9pjvq\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.461183 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.542139 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"] Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.543030 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.551997 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.656959 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkpbz\" (UniqueName: \"kubernetes.io/projected/8b27e0f1-7f20-4c92-b279-4a642379fc41-kube-api-access-fkpbz\") pod \"curl\" (UID: \"8b27e0f1-7f20-4c92-b279-4a642379fc41\") " pod="service-telemetry/curl" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.760790 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkpbz\" (UniqueName: \"kubernetes.io/projected/8b27e0f1-7f20-4c92-b279-4a642379fc41-kube-api-access-fkpbz\") pod \"curl\" (UID: \"8b27e0f1-7f20-4c92-b279-4a642379fc41\") " pod="service-telemetry/curl" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.816515 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkpbz\" (UniqueName: \"kubernetes.io/projected/8b27e0f1-7f20-4c92-b279-4a642379fc41-kube-api-access-fkpbz\") pod \"curl\" (UID: \"8b27e0f1-7f20-4c92-b279-4a642379fc41\") " pod="service-telemetry/curl" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.833201 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" event={"ID":"78320a5f-ee7d-4190-8af3-9d9609bcf111","Type":"ContainerStarted","Data":"a1875f622f815691b08bc79fca524319649ee79ff21f7659b380e66e7331878b"} Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.912873 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.997406 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-mjtw7"] Jan 27 00:34:58 crc kubenswrapper[4774]: I0127 00:34:58.149093 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Jan 27 00:34:58 crc kubenswrapper[4774]: I0127 00:34:58.357026 4774 scope.go:117] "RemoveContainer" containerID="de57d00be6247b9776d90dd256bd52c864446df8475238e51b4cc4b0797c6c30" Jan 27 00:34:58 crc kubenswrapper[4774]: I0127 00:34:58.358273 4774 scope.go:117] "RemoveContainer" containerID="d31f6718b5972460b7dfbd88cdf2e00be97bc26c0bfac2637e829c88a7995cf9" Jan 27 00:34:58 crc kubenswrapper[4774]: I0127 00:34:58.841078 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" event={"ID":"e41bad20-a43c-4561-aefa-15b6cdfb715b","Type":"ContainerStarted","Data":"a4b45d7c5256711b20005d0e8d1e762fe1d54612eb9b855eb186ad80c6d704c0"} Jan 27 00:34:58 crc kubenswrapper[4774]: I0127 00:34:58.842332 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"8b27e0f1-7f20-4c92-b279-4a642379fc41","Type":"ContainerStarted","Data":"a4b4800a9182a2cede7edf0bba16cdfeda5b569ebf0489b187dc1fbb7d2dca40"} Jan 27 00:34:59 crc kubenswrapper[4774]: I0127 00:34:59.869417 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" event={"ID":"221c6d11-7462-4578-bbb1-a78ee6bad7c0","Type":"ContainerStarted","Data":"cf05f6a90b463ca3e2066dfcaf8c4a4785856be0854c2186812ee45208a925fc"} Jan 27 00:34:59 crc kubenswrapper[4774]: I0127 00:34:59.872491 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" event={"ID":"40a1ab63-3b75-4f20-a692-58d80ccd2847","Type":"ContainerStarted","Data":"ea994568bce6d862e732b433ee25ed08324f97f3a871d2bfcb19e954a1bd5899"} Jan 27 00:35:00 crc kubenswrapper[4774]: I0127 00:35:00.883195 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"8b27e0f1-7f20-4c92-b279-4a642379fc41","Type":"ContainerStarted","Data":"18a2e3275099a59d98ea59bf9877de771dc72d4995929572319101ea7c183b79"} Jan 27 00:35:00 crc kubenswrapper[4774]: I0127 00:35:00.900362 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/curl" podStartSLOduration=1.399244628 podStartE2EDuration="3.900338351s" podCreationTimestamp="2026-01-27 00:34:57 +0000 UTC" firstStartedPulling="2026-01-27 00:34:58.159750039 +0000 UTC m=+1676.465526923" lastFinishedPulling="2026-01-27 00:35:00.660843752 +0000 UTC m=+1678.966620646" observedRunningTime="2026-01-27 00:35:00.894467433 +0000 UTC m=+1679.200244337" watchObservedRunningTime="2026-01-27 00:35:00.900338351 +0000 UTC m=+1679.206115235" Jan 27 00:35:01 crc kubenswrapper[4774]: I0127 00:35:01.901396 4774 generic.go:334] "Generic (PLEG): container finished" podID="8b27e0f1-7f20-4c92-b279-4a642379fc41" containerID="18a2e3275099a59d98ea59bf9877de771dc72d4995929572319101ea7c183b79" exitCode=0 Jan 27 00:35:01 crc kubenswrapper[4774]: I0127 00:35:01.901572 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"8b27e0f1-7f20-4c92-b279-4a642379fc41","Type":"ContainerDied","Data":"18a2e3275099a59d98ea59bf9877de771dc72d4995929572319101ea7c183b79"} Jan 27 00:35:11 crc kubenswrapper[4774]: 
I0127 00:35:11.357887 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:35:11 crc kubenswrapper[4774]: E0127 00:35:11.358422 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:35:13 crc kubenswrapper[4774]: E0127 00:35:13.495085 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/tripleomastercentos9/openstack-collectd:current-tripleo" Jan 27 00:35:13 crc kubenswrapper[4774]: E0127 00:35:13.496646 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:smoketest-collectd,Image:quay.io/tripleomastercentos9/openstack-collectd:current-tripleo,Command:[/smoketest_collectd_entrypoint.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CLOUDNAME,Value:smoke1,ValueFrom:nil,},EnvVar{Name:ELASTICSEARCH_AUTH_PASS,Value:TRctxSF5yjVx7wM13DX9sDEj,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_AUTH_TOKEN,Value:eyJhbGciOiJSUzI1NiIsImtpZCI6InF6SnFxNFFjbVk5VmJQZ2dNMmUxdHFmTlJlVWx4UDhSTlhIamV3RUx4WU0ifQ.eyJhdWQiOlsiaHR0cHM6Ly9rdWJlcm5ldGVzLmRlZmF1bHQuc3ZjIl0sImV4cCI6MTc2OTQ3NzY3OSwiaWF0IjoxNzY5NDc0MDc5LCJpc3MiOiJodHRwczovL2t1YmVybmV0ZXMuZGVmYXVsdC5zdmMiLCJqdGkiOiI3Y2E2ODBiNi1iNWVhLTQyNTgtYjQwZi0xMjhkZjkyM2NhMzAiLCJrdWJlcm5ldGVzLmlvIjp7Im5hbWVzcGFjZSI6InNlcnZpY2UtdGVsZW1ldHJ5Iiwic2VydmljZWFjY291bnQiOnsibmFtZSI6InN0Zi1wcm9tZXRoZXVzLXJlYWRlciIsInVpZCI6IjVlODU2Y2JlLWRhMWUtNGFkZS05Yjg0LWE1ZWJlMzg1Njk0ZSJ9fSwibmJmIjoxNzY5NDc0MDc5LCJzdWIiOiJzeXN0ZW06c2VydmljZWFjY291bnQ6c2VydmljZS10ZWxlbWV0cnk6c3RmLXByb21ldGhldXMtcmVhZGVyIn0.ldNLwti5zqlucyXu-cOzkivSmAZvjduc3ztWpwB_lYEjhgfKJZyEQuCfg3o_RhksF0LNMiXmNWuTcDqcuiJLZhxWYeyI2Nb2Ved0AkZvOroH2Pntmd4T_iy319JZxx-34yJ7cBo1sErdxeVbr63jrYT0HVzyIi3Xpq7XNMGIwlumW8iVc8E5b_7w2DfMl-gv7gLr2GTY5PO8fE-4WKBGLkKh10GPXVxBL5ToyvXfMjM78N618MjQfmONl3hiIEY023gyLM4HK7PUWF-VBgxA4ba1NVPsQGl2dNFBf2B2xJNvT19U85NmW_gbTa5Dst41w5SJBrUEGVp42Ne1pXoqAqOnSla3caDcMugh2dHXnP4BsOV14T6PhtwcDR8aeQUibiUqGI23ONmG3JXMwR80zV9rvQvFOSCDKU8bUpaY9XIqEJjIBPZ4VLg22tMT2egqspnhl0dnO925MTESSXXgZfJiMBPhPJij4RQyPgNQzkgUJNSDoxrjpgl4h2nMN6CNhws8t8mV4uooc9qxVum1cUdccCeNjBW2stZ0gOatw1LkPtwXacIS1GMTemTVm9dCnNJNza2Ov6LM0mAgQUz6r1_atHH70pZYwHXP1W4K2oxgw3aP9-ivcnFo3GSSbrPwkuO5LTi_OJt41YU5vNMN67aUKgjo_ikedOKtqB9sn5o,ValueFrom:nil,},EnvVar{Name:OBSERVABILITY_STRATEGY,Value:<>,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:collectd-config,ReadOnly:false,MountPath:/etc/minimal-collectd.conf.template,SubPath:minimal-collectd.conf.template,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:sensubility-config,ReadOnly:false,MountPath:/etc/collectd-sensubility.conf,SubPath:collectd-sensubility.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:healthcheck-log,ReadOnly:false,MountPath:/healthcheck.log,SubPath:healthcheck.log,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:collectd-entrypoint-script,ReadOnly:false,MountPath:/smoketest_collectd_entrypoint.sh,SubPath:s
moketest_collectd_entrypoint.sh,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9pjvq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod stf-smoketest-smoke1-mjtw7_service-telemetry(e41bad20-a43c-4561-aefa-15b6cdfb715b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 00:35:13 crc kubenswrapper[4774]: I0127 00:35:13.510773 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Jan 27 00:35:13 crc kubenswrapper[4774]: I0127 00:35:13.564355 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkpbz\" (UniqueName: \"kubernetes.io/projected/8b27e0f1-7f20-4c92-b279-4a642379fc41-kube-api-access-fkpbz\") pod \"8b27e0f1-7f20-4c92-b279-4a642379fc41\" (UID: \"8b27e0f1-7f20-4c92-b279-4a642379fc41\") " Jan 27 00:35:13 crc kubenswrapper[4774]: I0127 00:35:13.576140 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b27e0f1-7f20-4c92-b279-4a642379fc41-kube-api-access-fkpbz" (OuterVolumeSpecName: "kube-api-access-fkpbz") pod "8b27e0f1-7f20-4c92-b279-4a642379fc41" (UID: "8b27e0f1-7f20-4c92-b279-4a642379fc41"). InnerVolumeSpecName "kube-api-access-fkpbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:35:13 crc kubenswrapper[4774]: I0127 00:35:13.668484 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkpbz\" (UniqueName: \"kubernetes.io/projected/8b27e0f1-7f20-4c92-b279-4a642379fc41-kube-api-access-fkpbz\") on node \"crc\" DevicePath \"\"" Jan 27 00:35:13 crc kubenswrapper[4774]: I0127 00:35:13.683748 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_8b27e0f1-7f20-4c92-b279-4a642379fc41/curl/0.log" Jan 27 00:35:13 crc kubenswrapper[4774]: I0127 00:35:13.974069 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-mzmt2_60dce4ee-cdb6-4418-ac73-063f48c8be7e/prometheus-webhook-snmp/0.log" Jan 27 00:35:14 crc kubenswrapper[4774]: I0127 00:35:14.009675 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"8b27e0f1-7f20-4c92-b279-4a642379fc41","Type":"ContainerDied","Data":"a4b4800a9182a2cede7edf0bba16cdfeda5b569ebf0489b187dc1fbb7d2dca40"} Jan 27 00:35:14 crc kubenswrapper[4774]: I0127 00:35:14.009716 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4b4800a9182a2cede7edf0bba16cdfeda5b569ebf0489b187dc1fbb7d2dca40" Jan 27 00:35:14 crc kubenswrapper[4774]: I0127 00:35:14.009768 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Jan 27 00:35:16 crc kubenswrapper[4774]: I0127 00:35:16.484840 4774 scope.go:117] "RemoveContainer" containerID="4f2d7297d4d28dd848f59b341d688f7920adc8f241b7e1da6ad4079f86054ae7" Jan 27 00:35:17 crc kubenswrapper[4774]: I0127 00:35:17.993389 4774 scope.go:117] "RemoveContainer" containerID="1acb6f758817187337affc4bdd3cea29ad0aa6e46b274461e5eea0064a19c955" Jan 27 00:35:20 crc kubenswrapper[4774]: E0127 00:35:20.672922 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"smoketest-collectd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" podUID="e41bad20-a43c-4561-aefa-15b6cdfb715b" Jan 27 00:35:21 crc kubenswrapper[4774]: I0127 00:35:21.087446 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" event={"ID":"e41bad20-a43c-4561-aefa-15b6cdfb715b","Type":"ContainerStarted","Data":"65e528d8b0ae9b041234c20273bd32c0fb702d531f2d23967dddb18fbfaea7f8"} Jan 27 00:35:21 crc kubenswrapper[4774]: E0127 00:35:21.088979 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"smoketest-collectd\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/tripleomastercentos9/openstack-collectd:current-tripleo\\\"\"" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" podUID="e41bad20-a43c-4561-aefa-15b6cdfb715b" Jan 27 00:35:22 crc kubenswrapper[4774]: E0127 00:35:22.098473 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"smoketest-collectd\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/tripleomastercentos9/openstack-collectd:current-tripleo\\\"\"" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" podUID="e41bad20-a43c-4561-aefa-15b6cdfb715b" Jan 27 00:35:25 crc kubenswrapper[4774]: I0127 00:35:25.357270 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:35:25 crc kubenswrapper[4774]: E0127 00:35:25.358066 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:35:38 crc kubenswrapper[4774]: I0127 00:35:38.255625 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" event={"ID":"e41bad20-a43c-4561-aefa-15b6cdfb715b","Type":"ContainerStarted","Data":"612cbdd3efc1246a6024e827473fb19920043fe52ca7f85b7878afcf61cf927f"} Jan 27 00:35:38 crc kubenswrapper[4774]: I0127 00:35:38.356875 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:35:38 crc kubenswrapper[4774]: E0127 00:35:38.357218 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" 
podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:35:44 crc kubenswrapper[4774]: I0127 00:35:44.148186 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-mzmt2_60dce4ee-cdb6-4418-ac73-063f48c8be7e/prometheus-webhook-snmp/0.log" Jan 27 00:35:49 crc kubenswrapper[4774]: I0127 00:35:49.357440 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:35:49 crc kubenswrapper[4774]: E0127 00:35:49.358204 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:35:52 crc kubenswrapper[4774]: I0127 00:35:52.389377 4774 generic.go:334] "Generic (PLEG): container finished" podID="e41bad20-a43c-4561-aefa-15b6cdfb715b" containerID="65e528d8b0ae9b041234c20273bd32c0fb702d531f2d23967dddb18fbfaea7f8" exitCode=0 Jan 27 00:35:52 crc kubenswrapper[4774]: I0127 00:35:52.389445 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" event={"ID":"e41bad20-a43c-4561-aefa-15b6cdfb715b","Type":"ContainerDied","Data":"65e528d8b0ae9b041234c20273bd32c0fb702d531f2d23967dddb18fbfaea7f8"} Jan 27 00:35:52 crc kubenswrapper[4774]: I0127 00:35:52.390211 4774 scope.go:117] "RemoveContainer" containerID="65e528d8b0ae9b041234c20273bd32c0fb702d531f2d23967dddb18fbfaea7f8" Jan 27 00:36:03 crc kubenswrapper[4774]: I0127 00:36:03.356443 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:36:03 crc kubenswrapper[4774]: E0127 00:36:03.357558 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:36:12 crc kubenswrapper[4774]: I0127 00:36:12.564220 4774 generic.go:334] "Generic (PLEG): container finished" podID="e41bad20-a43c-4561-aefa-15b6cdfb715b" containerID="612cbdd3efc1246a6024e827473fb19920043fe52ca7f85b7878afcf61cf927f" exitCode=0 Jan 27 00:36:12 crc kubenswrapper[4774]: I0127 00:36:12.564284 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" event={"ID":"e41bad20-a43c-4561-aefa-15b6cdfb715b","Type":"ContainerDied","Data":"612cbdd3efc1246a6024e827473fb19920043fe52ca7f85b7878afcf61cf927f"} Jan 27 00:36:13 crc kubenswrapper[4774]: I0127 00:36:13.819134 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:36:13 crc kubenswrapper[4774]: I0127 00:36:13.986676 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-publisher\") pod \"e41bad20-a43c-4561-aefa-15b6cdfb715b\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " Jan 27 00:36:13 crc kubenswrapper[4774]: I0127 00:36:13.986748 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-entrypoint-script\") pod \"e41bad20-a43c-4561-aefa-15b6cdfb715b\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " Jan 27 00:36:13 crc kubenswrapper[4774]: I0127 00:36:13.986790 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-sensubility-config\") pod \"e41bad20-a43c-4561-aefa-15b6cdfb715b\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " Jan 27 00:36:13 crc kubenswrapper[4774]: I0127 00:36:13.986816 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-healthcheck-log\") pod \"e41bad20-a43c-4561-aefa-15b6cdfb715b\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " Jan 27 00:36:13 crc kubenswrapper[4774]: I0127 00:36:13.986844 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-config\") pod \"e41bad20-a43c-4561-aefa-15b6cdfb715b\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " Jan 27 00:36:13 crc kubenswrapper[4774]: I0127 00:36:13.986918 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pjvq\" (UniqueName: \"kubernetes.io/projected/e41bad20-a43c-4561-aefa-15b6cdfb715b-kube-api-access-9pjvq\") pod \"e41bad20-a43c-4561-aefa-15b6cdfb715b\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " Jan 27 00:36:13 crc kubenswrapper[4774]: I0127 00:36:13.986947 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-entrypoint-script\") pod \"e41bad20-a43c-4561-aefa-15b6cdfb715b\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " Jan 27 00:36:13 crc kubenswrapper[4774]: I0127 00:36:13.992555 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e41bad20-a43c-4561-aefa-15b6cdfb715b-kube-api-access-9pjvq" (OuterVolumeSpecName: "kube-api-access-9pjvq") pod "e41bad20-a43c-4561-aefa-15b6cdfb715b" (UID: "e41bad20-a43c-4561-aefa-15b6cdfb715b"). InnerVolumeSpecName "kube-api-access-9pjvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.005101 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "e41bad20-a43c-4561-aefa-15b6cdfb715b" (UID: "e41bad20-a43c-4561-aefa-15b6cdfb715b"). InnerVolumeSpecName "collectd-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.007632 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "e41bad20-a43c-4561-aefa-15b6cdfb715b" (UID: "e41bad20-a43c-4561-aefa-15b6cdfb715b"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.008333 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "e41bad20-a43c-4561-aefa-15b6cdfb715b" (UID: "e41bad20-a43c-4561-aefa-15b6cdfb715b"). InnerVolumeSpecName "ceilometer-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.010479 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "e41bad20-a43c-4561-aefa-15b6cdfb715b" (UID: "e41bad20-a43c-4561-aefa-15b6cdfb715b"). InnerVolumeSpecName "ceilometer-publisher". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.013783 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "e41bad20-a43c-4561-aefa-15b6cdfb715b" (UID: "e41bad20-a43c-4561-aefa-15b6cdfb715b"). InnerVolumeSpecName "sensubility-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.027075 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "e41bad20-a43c-4561-aefa-15b6cdfb715b" (UID: "e41bad20-a43c-4561-aefa-15b6cdfb715b"). InnerVolumeSpecName "healthcheck-log". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.088090 4774 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\"" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.088123 4774 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-sensubility-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.088132 4774 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-healthcheck-log\") on node \"crc\" DevicePath \"\"" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.088140 4774 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.088148 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pjvq\" (UniqueName: \"kubernetes.io/projected/e41bad20-a43c-4561-aefa-15b6cdfb715b-kube-api-access-9pjvq\") on node \"crc\" DevicePath \"\"" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.088157 4774 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\"" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.088166 4774 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-publisher\") on node \"crc\" DevicePath \"\"" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.580405 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" event={"ID":"e41bad20-a43c-4561-aefa-15b6cdfb715b","Type":"ContainerDied","Data":"a4b45d7c5256711b20005d0e8d1e762fe1d54612eb9b855eb186ad80c6d704c0"} Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.580450 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4b45d7c5256711b20005d0e8d1e762fe1d54612eb9b855eb186ad80c6d704c0" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.580537 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:36:15 crc kubenswrapper[4774]: I0127 00:36:15.769228 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-mjtw7_e41bad20-a43c-4561-aefa-15b6cdfb715b/smoketest-collectd/0.log" Jan 27 00:36:16 crc kubenswrapper[4774]: I0127 00:36:16.023575 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-mjtw7_e41bad20-a43c-4561-aefa-15b6cdfb715b/smoketest-ceilometer/0.log" Jan 27 00:36:16 crc kubenswrapper[4774]: I0127 00:36:16.293171 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-rjwl8_2480c4b1-c406-412e-9258-a94358f2c1c1/default-interconnect/0.log" Jan 27 00:36:16 crc kubenswrapper[4774]: I0127 00:36:16.551363 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr_221c6d11-7462-4578-bbb1-a78ee6bad7c0/bridge/2.log" Jan 27 00:36:16 crc kubenswrapper[4774]: I0127 00:36:16.797063 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr_221c6d11-7462-4578-bbb1-a78ee6bad7c0/sg-core/0.log" Jan 27 00:36:17 crc kubenswrapper[4774]: I0127 00:36:17.054228 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-564958f777-cn8q8_40a1ab63-3b75-4f20-a692-58d80ccd2847/bridge/2.log" Jan 27 00:36:17 crc kubenswrapper[4774]: I0127 00:36:17.303383 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-564958f777-cn8q8_40a1ab63-3b75-4f20-a692-58d80ccd2847/sg-core/0.log" Jan 27 00:36:17 crc kubenswrapper[4774]: I0127 00:36:17.356443 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:36:17 crc kubenswrapper[4774]: E0127 00:36:17.356674 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:36:17 crc kubenswrapper[4774]: I0127 00:36:17.551663 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9_d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6/bridge/2.log" Jan 27 00:36:17 crc kubenswrapper[4774]: I0127 00:36:17.796911 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9_d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6/sg-core/0.log" Jan 27 00:36:18 crc kubenswrapper[4774]: I0127 00:36:18.110736 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp_78320a5f-ee7d-4190-8af3-9d9609bcf111/bridge/2.log" Jan 27 00:36:18 crc kubenswrapper[4774]: I0127 00:36:18.366825 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp_78320a5f-ee7d-4190-8af3-9d9609bcf111/sg-core/0.log" Jan 27 00:36:18 crc kubenswrapper[4774]: I0127 00:36:18.642471 4774 
log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g_cce880cf-c820-49a2-9cf3-c2161499d51f/bridge/2.log" Jan 27 00:36:18 crc kubenswrapper[4774]: I0127 00:36:18.897256 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g_cce880cf-c820-49a2-9cf3-c2161499d51f/sg-core/0.log" Jan 27 00:36:22 crc kubenswrapper[4774]: I0127 00:36:22.636071 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-7b4c7b595f-78n7f_61fa30c6-90bc-4b6c-b850-2b2e59506e08/operator/0.log" Jan 27 00:36:22 crc kubenswrapper[4774]: I0127 00:36:22.928143 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_71259922-2e93-4571-80c6-e054f4372056/prometheus/0.log" Jan 27 00:36:23 crc kubenswrapper[4774]: I0127 00:36:23.219662 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_b4f320b3-4c9f-433a-943a-8f2934061b87/elasticsearch/0.log" Jan 27 00:36:23 crc kubenswrapper[4774]: I0127 00:36:23.543364 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-mzmt2_60dce4ee-cdb6-4418-ac73-063f48c8be7e/prometheus-webhook-snmp/0.log" Jan 27 00:36:23 crc kubenswrapper[4774]: I0127 00:36:23.835376 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_695367d2-8c58-4bee-adcc-61c6d1b7457b/alertmanager/0.log" Jan 27 00:36:31 crc kubenswrapper[4774]: I0127 00:36:31.357852 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:36:31 crc kubenswrapper[4774]: E0127 00:36:31.358961 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:36:40 crc kubenswrapper[4774]: I0127 00:36:40.112788 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-59f5866557-f7gmp_14f2c5e3-3625-4889-97e4-38820ac84518/operator/0.log" Jan 27 00:36:44 crc kubenswrapper[4774]: I0127 00:36:44.679318 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-7b4c7b595f-78n7f_61fa30c6-90bc-4b6c-b850-2b2e59506e08/operator/0.log" Jan 27 00:36:44 crc kubenswrapper[4774]: I0127 00:36:44.969000 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_fe039606-88c5-414d-b20c-a129a4bb0782/qdr/0.log" Jan 27 00:36:46 crc kubenswrapper[4774]: I0127 00:36:46.357580 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:36:46 crc kubenswrapper[4774]: E0127 00:36:46.358418 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.564889 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lz7gd"] Jan 27 00:36:50 crc kubenswrapper[4774]: E0127 00:36:50.566905 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e41bad20-a43c-4561-aefa-15b6cdfb715b" containerName="smoketest-ceilometer" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.566990 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e41bad20-a43c-4561-aefa-15b6cdfb715b" containerName="smoketest-ceilometer" Jan 27 00:36:50 crc kubenswrapper[4774]: E0127 00:36:50.567066 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e41bad20-a43c-4561-aefa-15b6cdfb715b" containerName="smoketest-collectd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.567124 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e41bad20-a43c-4561-aefa-15b6cdfb715b" containerName="smoketest-collectd" Jan 27 00:36:50 crc kubenswrapper[4774]: E0127 00:36:50.567191 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b27e0f1-7f20-4c92-b279-4a642379fc41" containerName="curl" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.567250 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b27e0f1-7f20-4c92-b279-4a642379fc41" containerName="curl" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.567444 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e41bad20-a43c-4561-aefa-15b6cdfb715b" containerName="smoketest-collectd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.567523 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e41bad20-a43c-4561-aefa-15b6cdfb715b" containerName="smoketest-ceilometer" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.567585 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b27e0f1-7f20-4c92-b279-4a642379fc41" containerName="curl" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.568557 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.580901 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lz7gd"] Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.619385 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbb26\" (UniqueName: \"kubernetes.io/projected/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-kube-api-access-gbb26\") pod \"certified-operators-lz7gd\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.619469 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-catalog-content\") pod \"certified-operators-lz7gd\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.619502 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-utilities\") pod \"certified-operators-lz7gd\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.721009 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-utilities\") pod \"certified-operators-lz7gd\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.721165 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbb26\" (UniqueName: \"kubernetes.io/projected/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-kube-api-access-gbb26\") pod \"certified-operators-lz7gd\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.721220 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-catalog-content\") pod \"certified-operators-lz7gd\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.721650 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-utilities\") pod \"certified-operators-lz7gd\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.721782 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-catalog-content\") pod \"certified-operators-lz7gd\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.747378 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gbb26\" (UniqueName: \"kubernetes.io/projected/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-kube-api-access-gbb26\") pod \"certified-operators-lz7gd\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.887094 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:51 crc kubenswrapper[4774]: I0127 00:36:51.168632 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lz7gd"] Jan 27 00:36:51 crc kubenswrapper[4774]: I0127 00:36:51.902790 4774 generic.go:334] "Generic (PLEG): container finished" podID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerID="2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9" exitCode=0 Jan 27 00:36:51 crc kubenswrapper[4774]: I0127 00:36:51.902899 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lz7gd" event={"ID":"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3","Type":"ContainerDied","Data":"2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9"} Jan 27 00:36:51 crc kubenswrapper[4774]: I0127 00:36:51.902987 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lz7gd" event={"ID":"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3","Type":"ContainerStarted","Data":"8b7d6c58c285061374502290ab42c6ad81f37ce1c161d4556ef119eed99d9371"} Jan 27 00:36:52 crc kubenswrapper[4774]: I0127 00:36:52.913830 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lz7gd" event={"ID":"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3","Type":"ContainerStarted","Data":"4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a"} Jan 27 00:36:53 crc kubenswrapper[4774]: I0127 00:36:53.929752 4774 generic.go:334] "Generic (PLEG): container finished" podID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerID="4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a" exitCode=0 Jan 27 00:36:53 crc kubenswrapper[4774]: I0127 00:36:53.929840 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lz7gd" event={"ID":"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3","Type":"ContainerDied","Data":"4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a"} Jan 27 00:36:54 crc kubenswrapper[4774]: I0127 00:36:54.940192 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lz7gd" event={"ID":"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3","Type":"ContainerStarted","Data":"468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40"} Jan 27 00:36:54 crc kubenswrapper[4774]: I0127 00:36:54.964821 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lz7gd" podStartSLOduration=2.50977763 podStartE2EDuration="4.964803386s" podCreationTimestamp="2026-01-27 00:36:50 +0000 UTC" firstStartedPulling="2026-01-27 00:36:51.9043642 +0000 UTC m=+1790.210141084" lastFinishedPulling="2026-01-27 00:36:54.359389926 +0000 UTC m=+1792.665166840" observedRunningTime="2026-01-27 00:36:54.959358261 +0000 UTC m=+1793.265135155" watchObservedRunningTime="2026-01-27 00:36:54.964803386 +0000 UTC m=+1793.270580270" Jan 27 00:36:59 crc kubenswrapper[4774]: I0127 00:36:59.356830 4774 scope.go:117] "RemoveContainer" 
containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:36:59 crc kubenswrapper[4774]: E0127 00:36:59.357417 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:37:00 crc kubenswrapper[4774]: I0127 00:37:00.887952 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:37:00 crc kubenswrapper[4774]: I0127 00:37:00.888444 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:37:00 crc kubenswrapper[4774]: I0127 00:37:00.974010 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:37:01 crc kubenswrapper[4774]: I0127 00:37:01.087757 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:37:02 crc kubenswrapper[4774]: I0127 00:37:02.166355 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lz7gd"] Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.034599 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lz7gd" podUID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerName="registry-server" containerID="cri-o://468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40" gracePeriod=2 Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.532031 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.666227 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbb26\" (UniqueName: \"kubernetes.io/projected/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-kube-api-access-gbb26\") pod \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.666901 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-utilities\") pod \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.667065 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-catalog-content\") pod \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.667799 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-utilities" (OuterVolumeSpecName: "utilities") pod "6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" (UID: "6ee72b59-8ea9-4448-abcd-e87f3cb6ead3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.679014 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-kube-api-access-gbb26" (OuterVolumeSpecName: "kube-api-access-gbb26") pod "6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" (UID: "6ee72b59-8ea9-4448-abcd-e87f3cb6ead3"). InnerVolumeSpecName "kube-api-access-gbb26". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.742378 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" (UID: "6ee72b59-8ea9-4448-abcd-e87f3cb6ead3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.768770 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbb26\" (UniqueName: \"kubernetes.io/projected/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-kube-api-access-gbb26\") on node \"crc\" DevicePath \"\"" Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.768824 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.768843 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.049928 4774 generic.go:334] "Generic (PLEG): container finished" podID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerID="468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40" exitCode=0 Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.050046 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lz7gd" event={"ID":"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3","Type":"ContainerDied","Data":"468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40"} Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.050097 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lz7gd" event={"ID":"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3","Type":"ContainerDied","Data":"8b7d6c58c285061374502290ab42c6ad81f37ce1c161d4556ef119eed99d9371"} Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.050135 4774 scope.go:117] "RemoveContainer" containerID="468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.050350 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.087205 4774 scope.go:117] "RemoveContainer" containerID="4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.119107 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lz7gd"] Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.123308 4774 scope.go:117] "RemoveContainer" containerID="2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.131345 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lz7gd"] Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.153562 4774 scope.go:117] "RemoveContainer" containerID="468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40" Jan 27 00:37:04 crc kubenswrapper[4774]: E0127 00:37:04.154473 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40\": container with ID starting with 468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40 not found: ID does not exist" containerID="468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.154583 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40"} err="failed to get container status \"468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40\": rpc error: code = NotFound desc = could not find container \"468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40\": container with ID starting with 468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40 not found: ID does not exist" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.154623 4774 scope.go:117] "RemoveContainer" containerID="4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a" Jan 27 00:37:04 crc kubenswrapper[4774]: E0127 00:37:04.155144 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a\": container with ID starting with 4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a not found: ID does not exist" containerID="4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.155210 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a"} err="failed to get container status \"4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a\": rpc error: code = NotFound desc = could not find container \"4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a\": container with ID starting with 4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a not found: ID does not exist" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.155255 4774 scope.go:117] "RemoveContainer" containerID="2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9" Jan 27 00:37:04 crc kubenswrapper[4774]: E0127 00:37:04.155763 4774 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9\": container with ID starting with 2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9 not found: ID does not exist" containerID="2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.155800 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9"} err="failed to get container status \"2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9\": rpc error: code = NotFound desc = could not find container \"2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9\": container with ID starting with 2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9 not found: ID does not exist" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.374983 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" path="/var/lib/kubelet/pods/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3/volumes" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.299237 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6xqfx/must-gather-qclb8"] Jan 27 00:37:11 crc kubenswrapper[4774]: E0127 00:37:11.300144 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerName="extract-content" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.300165 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerName="extract-content" Jan 27 00:37:11 crc kubenswrapper[4774]: E0127 00:37:11.300188 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerName="extract-utilities" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.300198 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerName="extract-utilities" Jan 27 00:37:11 crc kubenswrapper[4774]: E0127 00:37:11.300221 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerName="registry-server" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.300233 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerName="registry-server" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.300403 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerName="registry-server" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.301089 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6xqfx/must-gather-qclb8" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.304034 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6xqfx"/"openshift-service-ca.crt" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.305623 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6xqfx"/"kube-root-ca.crt" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.355996 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6xqfx/must-gather-qclb8"] Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.422511 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d48h7\" (UniqueName: \"kubernetes.io/projected/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-kube-api-access-d48h7\") pod \"must-gather-qclb8\" (UID: \"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec\") " pod="openshift-must-gather-6xqfx/must-gather-qclb8" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.422894 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-must-gather-output\") pod \"must-gather-qclb8\" (UID: \"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec\") " pod="openshift-must-gather-6xqfx/must-gather-qclb8" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.524756 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-must-gather-output\") pod \"must-gather-qclb8\" (UID: \"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec\") " pod="openshift-must-gather-6xqfx/must-gather-qclb8" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.524885 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d48h7\" (UniqueName: \"kubernetes.io/projected/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-kube-api-access-d48h7\") pod \"must-gather-qclb8\" (UID: \"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec\") " pod="openshift-must-gather-6xqfx/must-gather-qclb8" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.525287 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-must-gather-output\") pod \"must-gather-qclb8\" (UID: \"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec\") " pod="openshift-must-gather-6xqfx/must-gather-qclb8" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.545714 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d48h7\" (UniqueName: \"kubernetes.io/projected/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-kube-api-access-d48h7\") pod \"must-gather-qclb8\" (UID: \"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec\") " pod="openshift-must-gather-6xqfx/must-gather-qclb8" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.646269 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6xqfx/must-gather-qclb8" Jan 27 00:37:12 crc kubenswrapper[4774]: I0127 00:37:12.166108 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6xqfx/must-gather-qclb8"] Jan 27 00:37:12 crc kubenswrapper[4774]: I0127 00:37:12.366284 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:37:12 crc kubenswrapper[4774]: E0127 00:37:12.366517 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:37:13 crc kubenswrapper[4774]: I0127 00:37:13.133513 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6xqfx/must-gather-qclb8" event={"ID":"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec","Type":"ContainerStarted","Data":"0020e17fd9488eae2abf19a74cbb871d1c681d3317583ca561ce17e499dde3ee"} Jan 27 00:37:19 crc kubenswrapper[4774]: I0127 00:37:19.192559 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6xqfx/must-gather-qclb8" event={"ID":"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec","Type":"ContainerStarted","Data":"4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0"} Jan 27 00:37:19 crc kubenswrapper[4774]: I0127 00:37:19.193343 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6xqfx/must-gather-qclb8" event={"ID":"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec","Type":"ContainerStarted","Data":"18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb"} Jan 27 00:37:19 crc kubenswrapper[4774]: I0127 00:37:19.211296 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6xqfx/must-gather-qclb8" podStartSLOduration=2.28822916 podStartE2EDuration="8.211272004s" podCreationTimestamp="2026-01-27 00:37:11 +0000 UTC" firstStartedPulling="2026-01-27 00:37:12.160792647 +0000 UTC m=+1810.466569531" lastFinishedPulling="2026-01-27 00:37:18.083835491 +0000 UTC m=+1816.389612375" observedRunningTime="2026-01-27 00:37:19.208298404 +0000 UTC m=+1817.514075288" watchObservedRunningTime="2026-01-27 00:37:19.211272004 +0000 UTC m=+1817.517048898" Jan 27 00:37:20 crc kubenswrapper[4774]: I0127 00:37:20.489230 4774 scope.go:117] "RemoveContainer" containerID="14336beb86026bc90d2a9fb6311a3227ae7133bc262a6b935592389475831309" Jan 27 00:37:20 crc kubenswrapper[4774]: I0127 00:37:20.528287 4774 scope.go:117] "RemoveContainer" containerID="7bdfb69e9fa9696e37a7d9b25d175b63e5343122a1e569faca1c08955f8bbb1d" Jan 27 00:37:20 crc kubenswrapper[4774]: I0127 00:37:20.558258 4774 scope.go:117] "RemoveContainer" containerID="89911886b25b8c950e2ce162b903638384fd14c3a7c90ec3cfa3cf3f5c1e0950" Jan 27 00:37:20 crc kubenswrapper[4774]: I0127 00:37:20.605513 4774 scope.go:117] "RemoveContainer" containerID="49b5f230df2b7f05eeb23873ef8a668173f6d60ab2c8dbad4bbe6685a4aa6248" Jan 27 00:37:23 crc kubenswrapper[4774]: I0127 00:37:23.357162 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:37:23 crc kubenswrapper[4774]: E0127 00:37:23.357730 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:37:37 crc kubenswrapper[4774]: I0127 00:37:37.356403 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:37:37 crc kubenswrapper[4774]: E0127 00:37:37.357369 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:37:51 crc kubenswrapper[4774]: I0127 00:37:51.356901 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:37:51 crc kubenswrapper[4774]: E0127 00:37:51.359592 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:38:03 crc kubenswrapper[4774]: I0127 00:38:03.357878 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:38:03 crc kubenswrapper[4774]: E0127 00:38:03.358838 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:38:06 crc kubenswrapper[4774]: I0127 00:38:06.630659 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-tjg8b_ec68f29b-2b30-4dff-b875-7617466be51b/control-plane-machine-set-operator/0.log" Jan 27 00:38:06 crc kubenswrapper[4774]: I0127 00:38:06.765762 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kv45n_7cb54417-3e44-4c36-a1cf-438a705f8dcf/kube-rbac-proxy/0.log" Jan 27 00:38:06 crc kubenswrapper[4774]: I0127 00:38:06.766098 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kv45n_7cb54417-3e44-4c36-a1cf-438a705f8dcf/machine-api-operator/0.log" Jan 27 00:38:17 crc kubenswrapper[4774]: I0127 00:38:17.357317 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:38:17 crc kubenswrapper[4774]: E0127 00:38:17.358238 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:38:19 crc kubenswrapper[4774]: I0127 00:38:19.090300 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-9mmc4_0a5e3cb3-d23f-405b-bb88-18159ee24067/cert-manager-controller/0.log" Jan 27 00:38:19 crc kubenswrapper[4774]: I0127 00:38:19.241240 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-hp7q6_84e34f19-48c2-4096-83e7-95b9fd16a1e6/cert-manager-cainjector/0.log" Jan 27 00:38:19 crc kubenswrapper[4774]: I0127 00:38:19.318387 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-gg42r_32c71eef-359e-42d1-a9c8-d0f392ecabaa/cert-manager-webhook/0.log" Jan 27 00:38:28 crc kubenswrapper[4774]: I0127 00:38:28.357175 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:38:28 crc kubenswrapper[4774]: E0127 00:38:28.358249 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:38:34 crc kubenswrapper[4774]: I0127 00:38:34.942667 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-qzvj5_82ad6e88-a32b-4f4f-9a96-66d10c58a7d9/prometheus-operator/0.log" Jan 27 00:38:35 crc kubenswrapper[4774]: I0127 00:38:35.143690 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6495c554dc-2b75r_dfee63d3-9a5d-46f6-b984-78d6a837e20c/prometheus-operator-admission-webhook/0.log" Jan 27 00:38:35 crc kubenswrapper[4774]: I0127 00:38:35.182913 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6495c554dc-548dl_4ac18775-726a-43da-a184-dfd1565544f1/prometheus-operator-admission-webhook/0.log" Jan 27 00:38:35 crc kubenswrapper[4774]: I0127 00:38:35.334873 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-fqps4_3247d37e-1277-411a-ad8b-ffcd6172206f/operator/0.log" Jan 27 00:38:35 crc kubenswrapper[4774]: I0127 00:38:35.383956 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-b7pt6_96fc7bad-ed57-4110-afa8-9a6e5748c292/perses-operator/0.log" Jan 27 00:38:41 crc kubenswrapper[4774]: I0127 00:38:41.357197 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:38:41 crc kubenswrapper[4774]: I0127 00:38:41.932434 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"ea1f7f1507e8ab30d775c393877ef9ca6aab35f20f5448926e8f50f33e3e67f9"} Jan 27 00:38:51 crc kubenswrapper[4774]: I0127 
00:38:51.583015 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq_4bdb298d-0c46-429d-b4c2-44d106881eb7/util/0.log" Jan 27 00:38:51 crc kubenswrapper[4774]: I0127 00:38:51.778728 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq_4bdb298d-0c46-429d-b4c2-44d106881eb7/util/0.log" Jan 27 00:38:51 crc kubenswrapper[4774]: I0127 00:38:51.812521 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq_4bdb298d-0c46-429d-b4c2-44d106881eb7/pull/0.log" Jan 27 00:38:51 crc kubenswrapper[4774]: I0127 00:38:51.842006 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq_4bdb298d-0c46-429d-b4c2-44d106881eb7/pull/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.015693 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq_4bdb298d-0c46-429d-b4c2-44d106881eb7/util/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.056010 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq_4bdb298d-0c46-429d-b4c2-44d106881eb7/extract/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.056596 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq_4bdb298d-0c46-429d-b4c2-44d106881eb7/pull/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.240317 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb_bcd9000d-51f5-47d1-9fb0-a1177a6c6b06/util/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.433625 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb_bcd9000d-51f5-47d1-9fb0-a1177a6c6b06/pull/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.450789 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb_bcd9000d-51f5-47d1-9fb0-a1177a6c6b06/util/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.476227 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb_bcd9000d-51f5-47d1-9fb0-a1177a6c6b06/pull/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.645999 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb_bcd9000d-51f5-47d1-9fb0-a1177a6c6b06/extract/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.755994 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb_bcd9000d-51f5-47d1-9fb0-a1177a6c6b06/util/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.815283 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb_bcd9000d-51f5-47d1-9fb0-a1177a6c6b06/pull/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.867714 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b_1b3f5387-0b7c-4c6d-8aff-e293c038aafb/util/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.064571 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b_1b3f5387-0b7c-4c6d-8aff-e293c038aafb/util/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.065809 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b_1b3f5387-0b7c-4c6d-8aff-e293c038aafb/pull/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.123909 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b_1b3f5387-0b7c-4c6d-8aff-e293c038aafb/pull/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.373441 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b_1b3f5387-0b7c-4c6d-8aff-e293c038aafb/pull/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.383700 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b_1b3f5387-0b7c-4c6d-8aff-e293c038aafb/util/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.384290 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b_1b3f5387-0b7c-4c6d-8aff-e293c038aafb/extract/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.555252 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2_f2086af6-4ed5-4f66-bf01-aa661ba5a168/util/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.717780 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2_f2086af6-4ed5-4f66-bf01-aa661ba5a168/pull/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.757189 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2_f2086af6-4ed5-4f66-bf01-aa661ba5a168/pull/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.769601 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2_f2086af6-4ed5-4f66-bf01-aa661ba5a168/util/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.960176 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2_f2086af6-4ed5-4f66-bf01-aa661ba5a168/util/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.965456 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2_f2086af6-4ed5-4f66-bf01-aa661ba5a168/pull/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.977588 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2_f2086af6-4ed5-4f66-bf01-aa661ba5a168/extract/0.log" Jan 27 00:38:54 crc kubenswrapper[4774]: I0127 00:38:54.151105 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7bhhh_0b4a869e-2e74-4226-b41c-c8a481a0728b/extract-utilities/0.log" Jan 27 00:38:54 crc kubenswrapper[4774]: I0127 00:38:54.378686 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7bhhh_0b4a869e-2e74-4226-b41c-c8a481a0728b/extract-content/0.log" Jan 27 00:38:54 crc kubenswrapper[4774]: I0127 00:38:54.389267 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7bhhh_0b4a869e-2e74-4226-b41c-c8a481a0728b/extract-content/0.log" Jan 27 00:38:54 crc kubenswrapper[4774]: I0127 00:38:54.392282 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7bhhh_0b4a869e-2e74-4226-b41c-c8a481a0728b/extract-utilities/0.log" Jan 27 00:38:54 crc kubenswrapper[4774]: I0127 00:38:54.575406 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7bhhh_0b4a869e-2e74-4226-b41c-c8a481a0728b/extract-utilities/0.log" Jan 27 00:38:54 crc kubenswrapper[4774]: I0127 00:38:54.593589 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7bhhh_0b4a869e-2e74-4226-b41c-c8a481a0728b/extract-content/0.log" Jan 27 00:38:54 crc kubenswrapper[4774]: I0127 00:38:54.809756 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6krcw_89dc2f6e-3f9e-4098-b5d4-ff9481de0824/extract-utilities/0.log" Jan 27 00:38:54 crc kubenswrapper[4774]: I0127 00:38:54.966007 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6krcw_89dc2f6e-3f9e-4098-b5d4-ff9481de0824/extract-utilities/0.log" Jan 27 00:38:54 crc kubenswrapper[4774]: I0127 00:38:54.968456 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7bhhh_0b4a869e-2e74-4226-b41c-c8a481a0728b/registry-server/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.005494 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6krcw_89dc2f6e-3f9e-4098-b5d4-ff9481de0824/extract-content/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.022553 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6krcw_89dc2f6e-3f9e-4098-b5d4-ff9481de0824/extract-content/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.187646 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6krcw_89dc2f6e-3f9e-4098-b5d4-ff9481de0824/extract-content/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.215197 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6krcw_89dc2f6e-3f9e-4098-b5d4-ff9481de0824/extract-utilities/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.304261 4774 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-bcz6w_fb20bbdd-0a13-4d07-8c4a-8c2285de3173/marketplace-operator/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.497555 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tg7j5_8be8b5b9-e3bc-4236-90ca-3d3808fa39b4/extract-utilities/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.614435 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6krcw_89dc2f6e-3f9e-4098-b5d4-ff9481de0824/registry-server/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.681196 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tg7j5_8be8b5b9-e3bc-4236-90ca-3d3808fa39b4/extract-content/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.698120 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tg7j5_8be8b5b9-e3bc-4236-90ca-3d3808fa39b4/extract-utilities/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.706566 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tg7j5_8be8b5b9-e3bc-4236-90ca-3d3808fa39b4/extract-content/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.888171 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tg7j5_8be8b5b9-e3bc-4236-90ca-3d3808fa39b4/extract-utilities/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.907041 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tg7j5_8be8b5b9-e3bc-4236-90ca-3d3808fa39b4/extract-content/0.log" Jan 27 00:38:56 crc kubenswrapper[4774]: I0127 00:38:56.441654 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tg7j5_8be8b5b9-e3bc-4236-90ca-3d3808fa39b4/registry-server/0.log" Jan 27 00:39:10 crc kubenswrapper[4774]: I0127 00:39:10.397934 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-qzvj5_82ad6e88-a32b-4f4f-9a96-66d10c58a7d9/prometheus-operator/0.log" Jan 27 00:39:10 crc kubenswrapper[4774]: I0127 00:39:10.404259 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6495c554dc-548dl_4ac18775-726a-43da-a184-dfd1565544f1/prometheus-operator-admission-webhook/0.log" Jan 27 00:39:10 crc kubenswrapper[4774]: I0127 00:39:10.406546 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6495c554dc-2b75r_dfee63d3-9a5d-46f6-b984-78d6a837e20c/prometheus-operator-admission-webhook/0.log" Jan 27 00:39:10 crc kubenswrapper[4774]: I0127 00:39:10.578871 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-fqps4_3247d37e-1277-411a-ad8b-ffcd6172206f/operator/0.log" Jan 27 00:39:10 crc kubenswrapper[4774]: I0127 00:39:10.622213 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-b7pt6_96fc7bad-ed57-4110-afa8-9a6e5748c292/perses-operator/0.log" Jan 27 00:39:50 crc kubenswrapper[4774]: I0127 00:39:50.209779 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-2njqk"] Jan 27 00:39:50 crc 
kubenswrapper[4774]: I0127 00:39:50.212727 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-2njqk"
Jan 27 00:39:50 crc kubenswrapper[4774]: I0127 00:39:50.221009 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-2njqk"]
Jan 27 00:39:50 crc kubenswrapper[4774]: I0127 00:39:50.388719 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxx76\" (UniqueName: \"kubernetes.io/projected/e8e943dd-43b5-4948-b976-abb2c181609b-kube-api-access-qxx76\") pod \"infrawatch-operators-2njqk\" (UID: \"e8e943dd-43b5-4948-b976-abb2c181609b\") " pod="service-telemetry/infrawatch-operators-2njqk"
Jan 27 00:39:50 crc kubenswrapper[4774]: I0127 00:39:50.490305 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxx76\" (UniqueName: \"kubernetes.io/projected/e8e943dd-43b5-4948-b976-abb2c181609b-kube-api-access-qxx76\") pod \"infrawatch-operators-2njqk\" (UID: \"e8e943dd-43b5-4948-b976-abb2c181609b\") " pod="service-telemetry/infrawatch-operators-2njqk"
Jan 27 00:39:50 crc kubenswrapper[4774]: I0127 00:39:50.530579 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxx76\" (UniqueName: \"kubernetes.io/projected/e8e943dd-43b5-4948-b976-abb2c181609b-kube-api-access-qxx76\") pod \"infrawatch-operators-2njqk\" (UID: \"e8e943dd-43b5-4948-b976-abb2c181609b\") " pod="service-telemetry/infrawatch-operators-2njqk"
Jan 27 00:39:50 crc kubenswrapper[4774]: I0127 00:39:50.830909 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-2njqk"
Jan 27 00:39:51 crc kubenswrapper[4774]: I0127 00:39:51.146709 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-2njqk"]
Jan 27 00:39:51 crc kubenswrapper[4774]: I0127 00:39:51.582379 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-2njqk" event={"ID":"e8e943dd-43b5-4948-b976-abb2c181609b","Type":"ContainerStarted","Data":"9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471"}
Jan 27 00:39:51 crc kubenswrapper[4774]: I0127 00:39:51.583095 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-2njqk" event={"ID":"e8e943dd-43b5-4948-b976-abb2c181609b","Type":"ContainerStarted","Data":"77313127db6b9d739b39d1ec129516ee15b3e8ddcd5f636849ae6284149574cf"}
Jan 27 00:39:51 crc kubenswrapper[4774]: I0127 00:39:51.621497 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-2njqk" podStartSLOduration=1.494574756 podStartE2EDuration="1.621454591s" podCreationTimestamp="2026-01-27 00:39:50 +0000 UTC" firstStartedPulling="2026-01-27 00:39:51.160767116 +0000 UTC m=+1969.466544010" lastFinishedPulling="2026-01-27 00:39:51.287646971 +0000 UTC m=+1969.593423845" observedRunningTime="2026-01-27 00:39:51.611196005 +0000 UTC m=+1969.916972929" watchObservedRunningTime="2026-01-27 00:39:51.621454591 +0000 UTC m=+1969.927231515"
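The "Observed pod startup duration" entry above reports podStartSLOduration=1.494574756 together with podStartE2EDuration="1.621454591s" and an image-pull window (firstStartedPulling to lastFinishedPulling) of roughly 127 ms. Those numbers are consistent with the SLO figure being the end-to-end startup time minus the time spent pulling the image, to within a few tens of nanoseconds. Below is a minimal Go sketch of that arithmetic, using the timestamps copied from the entry; reading the SLO value as "E2E minus pull time" is an observation about these logged values, not a quotation of kubelet code.

package main

import (
	"fmt"
	"time"
)

// mustParse parses timestamps in the layout the kubelet log prints,
// e.g. "2026-01-27 00:39:51.160767116 +0000 UTC".
func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Values copied from the infrawatch-operators-2njqk entry above.
	created := mustParse("2026-01-27 00:39:50 +0000 UTC")             // podCreationTimestamp
	firstPull := mustParse("2026-01-27 00:39:51.160767116 +0000 UTC") // firstStartedPulling
	lastPull := mustParse("2026-01-27 00:39:51.287646971 +0000 UTC")  // lastFinishedPulling
	observed := mustParse("2026-01-27 00:39:51.621454591 +0000 UTC")  // watchObservedRunningTime

	e2e := observed.Sub(created)    // 1.621454591s, matches podStartE2EDuration
	pull := lastPull.Sub(firstPull) // ~126.9ms spent pulling the image
	fmt.Println("e2e:", e2e)
	fmt.Println("pull:", pull)
	fmt.Println("e2e - pull:", e2e-pull) // ~1.4946s, close to podStartSLOduration
}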
Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.609409 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nc8j2"]
Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.612548 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nc8j2"
Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.643509 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nc8j2"]
Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.667334 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-catalog-content\") pod \"redhat-operators-nc8j2\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " pod="openshift-marketplace/redhat-operators-nc8j2"
Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.667402 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-utilities\") pod \"redhat-operators-nc8j2\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " pod="openshift-marketplace/redhat-operators-nc8j2"
Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.667473 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk2cn\" (UniqueName: \"kubernetes.io/projected/fea77ced-5bd1-430a-81c4-494ff6946525-kube-api-access-fk2cn\") pod \"redhat-operators-nc8j2\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " pod="openshift-marketplace/redhat-operators-nc8j2"
Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.768649 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk2cn\" (UniqueName: \"kubernetes.io/projected/fea77ced-5bd1-430a-81c4-494ff6946525-kube-api-access-fk2cn\") pod \"redhat-operators-nc8j2\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " pod="openshift-marketplace/redhat-operators-nc8j2"
Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.768741 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-catalog-content\") pod \"redhat-operators-nc8j2\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " pod="openshift-marketplace/redhat-operators-nc8j2"
Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.768775 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-utilities\") pod \"redhat-operators-nc8j2\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " pod="openshift-marketplace/redhat-operators-nc8j2"
Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.769507 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-utilities\") pod \"redhat-operators-nc8j2\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " pod="openshift-marketplace/redhat-operators-nc8j2"
Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.769930 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-catalog-content\") pod \"redhat-operators-nc8j2\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " pod="openshift-marketplace/redhat-operators-nc8j2"
Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.808956 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"kube-api-access-fk2cn\" (UniqueName: \"kubernetes.io/projected/fea77ced-5bd1-430a-81c4-494ff6946525-kube-api-access-fk2cn\") pod \"redhat-operators-nc8j2\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " pod="openshift-marketplace/redhat-operators-nc8j2" Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.947385 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nc8j2" Jan 27 00:39:55 crc kubenswrapper[4774]: I0127 00:39:55.180269 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nc8j2"] Jan 27 00:39:55 crc kubenswrapper[4774]: W0127 00:39:55.188074 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfea77ced_5bd1_430a_81c4_494ff6946525.slice/crio-c15104766ad4eb5276f13fe7561352bf070cbb370cee9773ec55b6e473cc6d07 WatchSource:0}: Error finding container c15104766ad4eb5276f13fe7561352bf070cbb370cee9773ec55b6e473cc6d07: Status 404 returned error can't find the container with id c15104766ad4eb5276f13fe7561352bf070cbb370cee9773ec55b6e473cc6d07 Jan 27 00:39:55 crc kubenswrapper[4774]: I0127 00:39:55.617946 4774 generic.go:334] "Generic (PLEG): container finished" podID="fea77ced-5bd1-430a-81c4-494ff6946525" containerID="8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37" exitCode=0 Jan 27 00:39:55 crc kubenswrapper[4774]: I0127 00:39:55.618016 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8j2" event={"ID":"fea77ced-5bd1-430a-81c4-494ff6946525","Type":"ContainerDied","Data":"8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37"} Jan 27 00:39:55 crc kubenswrapper[4774]: I0127 00:39:55.618406 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8j2" event={"ID":"fea77ced-5bd1-430a-81c4-494ff6946525","Type":"ContainerStarted","Data":"c15104766ad4eb5276f13fe7561352bf070cbb370cee9773ec55b6e473cc6d07"} Jan 27 00:39:56 crc kubenswrapper[4774]: I0127 00:39:56.629712 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8j2" event={"ID":"fea77ced-5bd1-430a-81c4-494ff6946525","Type":"ContainerStarted","Data":"4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2"} Jan 27 00:39:57 crc kubenswrapper[4774]: I0127 00:39:57.641548 4774 generic.go:334] "Generic (PLEG): container finished" podID="fea77ced-5bd1-430a-81c4-494ff6946525" containerID="4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2" exitCode=0 Jan 27 00:39:57 crc kubenswrapper[4774]: I0127 00:39:57.641646 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8j2" event={"ID":"fea77ced-5bd1-430a-81c4-494ff6946525","Type":"ContainerDied","Data":"4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2"} Jan 27 00:39:57 crc kubenswrapper[4774]: I0127 00:39:57.644365 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 00:39:58 crc kubenswrapper[4774]: I0127 00:39:58.654331 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8j2" event={"ID":"fea77ced-5bd1-430a-81c4-494ff6946525","Type":"ContainerStarted","Data":"7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b"} Jan 27 00:39:58 crc kubenswrapper[4774]: I0127 00:39:58.688036 4774 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/redhat-operators-nc8j2" podStartSLOduration=2.253131854 podStartE2EDuration="4.68800349s" podCreationTimestamp="2026-01-27 00:39:54 +0000 UTC" firstStartedPulling="2026-01-27 00:39:55.621399897 +0000 UTC m=+1973.927176781" lastFinishedPulling="2026-01-27 00:39:58.056271533 +0000 UTC m=+1976.362048417" observedRunningTime="2026-01-27 00:39:58.683201791 +0000 UTC m=+1976.988978745" watchObservedRunningTime="2026-01-27 00:39:58.68800349 +0000 UTC m=+1976.993780434" Jan 27 00:40:00 crc kubenswrapper[4774]: I0127 00:40:00.831387 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-2njqk" Jan 27 00:40:00 crc kubenswrapper[4774]: I0127 00:40:00.831485 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-2njqk" Jan 27 00:40:00 crc kubenswrapper[4774]: I0127 00:40:00.876929 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-2njqk" Jan 27 00:40:01 crc kubenswrapper[4774]: I0127 00:40:01.757763 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-2njqk" Jan 27 00:40:03 crc kubenswrapper[4774]: I0127 00:40:03.596842 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-2njqk"] Jan 27 00:40:03 crc kubenswrapper[4774]: I0127 00:40:03.712634 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-2njqk" podUID="e8e943dd-43b5-4948-b976-abb2c181609b" containerName="registry-server" containerID="cri-o://9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471" gracePeriod=2 Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.124599 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-2njqk" Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.176582 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxx76\" (UniqueName: \"kubernetes.io/projected/e8e943dd-43b5-4948-b976-abb2c181609b-kube-api-access-qxx76\") pod \"e8e943dd-43b5-4948-b976-abb2c181609b\" (UID: \"e8e943dd-43b5-4948-b976-abb2c181609b\") " Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.185304 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8e943dd-43b5-4948-b976-abb2c181609b-kube-api-access-qxx76" (OuterVolumeSpecName: "kube-api-access-qxx76") pod "e8e943dd-43b5-4948-b976-abb2c181609b" (UID: "e8e943dd-43b5-4948-b976-abb2c181609b"). InnerVolumeSpecName "kube-api-access-qxx76". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.279344 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxx76\" (UniqueName: \"kubernetes.io/projected/e8e943dd-43b5-4948-b976-abb2c181609b-kube-api-access-qxx76\") on node \"crc\" DevicePath \"\"" Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.729230 4774 generic.go:334] "Generic (PLEG): container finished" podID="e8e943dd-43b5-4948-b976-abb2c181609b" containerID="9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471" exitCode=0 Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.729354 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-2njqk"
Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.729393 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-2njqk" event={"ID":"e8e943dd-43b5-4948-b976-abb2c181609b","Type":"ContainerDied","Data":"9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471"}
Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.731298 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-2njqk" event={"ID":"e8e943dd-43b5-4948-b976-abb2c181609b","Type":"ContainerDied","Data":"77313127db6b9d739b39d1ec129516ee15b3e8ddcd5f636849ae6284149574cf"}
Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.731416 4774 scope.go:117] "RemoveContainer" containerID="9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471"
Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.758786 4774 scope.go:117] "RemoveContainer" containerID="9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471"
Jan 27 00:40:04 crc kubenswrapper[4774]: E0127 00:40:04.759454 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471\": container with ID starting with 9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471 not found: ID does not exist" containerID="9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471"
Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.759544 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471"} err="failed to get container status \"9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471\": rpc error: code = NotFound desc = could not find container \"9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471\": container with ID starting with 9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471 not found: ID does not exist"
Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.762487 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-2njqk"]
Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.769233 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-2njqk"]
Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.948257 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nc8j2"
Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.948653 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nc8j2"
Jan 27 00:40:06 crc kubenswrapper[4774]: I0127 00:40:06.014646 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nc8j2" podUID="fea77ced-5bd1-430a-81c4-494ff6946525" containerName="registry-server" probeResult="failure" output=<
Jan 27 00:40:06 crc kubenswrapper[4774]: timeout: failed to connect service ":50051" within 1s
Jan 27 00:40:06 crc kubenswrapper[4774]: >
Jan 27 00:40:06 crc kubenswrapper[4774]: I0127 00:40:06.369464 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8e943dd-43b5-4948-b976-abb2c181609b" path="/var/lib/kubelet/pods/e8e943dd-43b5-4948-b976-abb2c181609b/volumes"
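The startup-probe failure just above (timeout: failed to connect service ":50051" within 1s) is the registry-server container not yet accepting connections on its gRPC port while the catalog content is still being unpacked. The sketch below approximates that check with a plain 1-second TCP dial in Go; the port and the 1 s budget come from the log line, but the pod's real probe is whatever its spec defines (likely a gRPC health check), so the raw dial is an assumption for illustration only.

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Attempt to reach the port named in the probe output above within 1s.
	conn, err := net.DialTimeout("tcp", "127.0.0.1:50051", time.Second)
	if err != nil {
		// This is the situation the kubelet reports while the registry-server
		// is still starting: the connection cannot be made in time.
		fmt.Println("probe would fail:", err)
		return
	}
	defer conn.Close()
	fmt.Println("probe would succeed: something is listening on :50051")
}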
Jan 27 00:40:07 crc kubenswrapper[4774]: I0127 00:40:07.769271 4774 generic.go:334] "Generic (PLEG): container finished" podID="0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" containerID="18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb" exitCode=0
Jan 27 00:40:07 crc kubenswrapper[4774]: I0127 00:40:07.769422 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6xqfx/must-gather-qclb8" event={"ID":"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec","Type":"ContainerDied","Data":"18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb"}
Jan 27 00:40:07 crc kubenswrapper[4774]: I0127 00:40:07.770421 4774 scope.go:117] "RemoveContainer" containerID="18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb"
Jan 27 00:40:07 crc kubenswrapper[4774]: I0127 00:40:07.931349 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6xqfx_must-gather-qclb8_0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec/gather/0.log"
Jan 27 00:40:14 crc kubenswrapper[4774]: I0127 00:40:14.941949 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6xqfx/must-gather-qclb8"]
Jan 27 00:40:14 crc kubenswrapper[4774]: I0127 00:40:14.943104 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-6xqfx/must-gather-qclb8" podUID="0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" containerName="copy" containerID="cri-o://4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0" gracePeriod=2
Jan 27 00:40:14 crc kubenswrapper[4774]: I0127 00:40:14.949846 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6xqfx/must-gather-qclb8"]
Jan 27 00:40:14 crc kubenswrapper[4774]: I0127 00:40:14.999511 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nc8j2"
Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.078291 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nc8j2"
Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.330414 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6xqfx_must-gather-qclb8_0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec/copy/0.log"
Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.331172 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6xqfx/must-gather-qclb8"
Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.416596 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d48h7\" (UniqueName: \"kubernetes.io/projected/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-kube-api-access-d48h7\") pod \"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec\" (UID: \"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec\") "
Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.416832 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-must-gather-output\") pod \"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec\" (UID: \"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec\") "
Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.428083 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-kube-api-access-d48h7" (OuterVolumeSpecName: "kube-api-access-d48h7") pod "0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" (UID: "0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec").
InnerVolumeSpecName "kube-api-access-d48h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.488353 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" (UID: "0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.519090 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d48h7\" (UniqueName: \"kubernetes.io/projected/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-kube-api-access-d48h7\") on node \"crc\" DevicePath \"\"" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.519149 4774 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.601081 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nc8j2"] Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.857493 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6xqfx_must-gather-qclb8_0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec/copy/0.log" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.858483 4774 generic.go:334] "Generic (PLEG): container finished" podID="0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" containerID="4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0" exitCode=143 Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.858597 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6xqfx/must-gather-qclb8" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.858675 4774 scope.go:117] "RemoveContainer" containerID="4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.890276 4774 scope.go:117] "RemoveContainer" containerID="18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.958167 4774 scope.go:117] "RemoveContainer" containerID="4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0" Jan 27 00:40:15 crc kubenswrapper[4774]: E0127 00:40:15.958809 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0\": container with ID starting with 4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0 not found: ID does not exist" containerID="4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.958878 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0"} err="failed to get container status \"4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0\": rpc error: code = NotFound desc = could not find container \"4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0\": container with ID starting with 4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0 not found: ID does not exist" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.958922 4774 scope.go:117] "RemoveContainer" containerID="18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb" Jan 27 00:40:15 crc kubenswrapper[4774]: E0127 00:40:15.959471 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb\": container with ID starting with 18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb not found: ID does not exist" containerID="18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.959544 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb"} err="failed to get container status \"18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb\": rpc error: code = NotFound desc = could not find container \"18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb\": container with ID starting with 18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb not found: ID does not exist" Jan 27 00:40:16 crc kubenswrapper[4774]: I0127 00:40:16.373562 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" path="/var/lib/kubelet/pods/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec/volumes" Jan 27 00:40:16 crc kubenswrapper[4774]: I0127 00:40:16.866679 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nc8j2" podUID="fea77ced-5bd1-430a-81c4-494ff6946525" containerName="registry-server" containerID="cri-o://7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b" gracePeriod=2 Jan 27 00:40:17 crc 
kubenswrapper[4774]: I0127 00:40:17.326097 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nc8j2" Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.363605 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-utilities\") pod \"fea77ced-5bd1-430a-81c4-494ff6946525\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.363772 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk2cn\" (UniqueName: \"kubernetes.io/projected/fea77ced-5bd1-430a-81c4-494ff6946525-kube-api-access-fk2cn\") pod \"fea77ced-5bd1-430a-81c4-494ff6946525\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.363832 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-catalog-content\") pod \"fea77ced-5bd1-430a-81c4-494ff6946525\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.366821 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-utilities" (OuterVolumeSpecName: "utilities") pod "fea77ced-5bd1-430a-81c4-494ff6946525" (UID: "fea77ced-5bd1-430a-81c4-494ff6946525"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.380090 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fea77ced-5bd1-430a-81c4-494ff6946525-kube-api-access-fk2cn" (OuterVolumeSpecName: "kube-api-access-fk2cn") pod "fea77ced-5bd1-430a-81c4-494ff6946525" (UID: "fea77ced-5bd1-430a-81c4-494ff6946525"). InnerVolumeSpecName "kube-api-access-fk2cn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.465820 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.466008 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk2cn\" (UniqueName: \"kubernetes.io/projected/fea77ced-5bd1-430a-81c4-494ff6946525-kube-api-access-fk2cn\") on node \"crc\" DevicePath \"\"" Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.524372 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fea77ced-5bd1-430a-81c4-494ff6946525" (UID: "fea77ced-5bd1-430a-81c4-494ff6946525"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.567145 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.883822 4774 generic.go:334] "Generic (PLEG): container finished" podID="fea77ced-5bd1-430a-81c4-494ff6946525" containerID="7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b" exitCode=0
Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.883974 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8j2" event={"ID":"fea77ced-5bd1-430a-81c4-494ff6946525","Type":"ContainerDied","Data":"7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b"}
Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.884050 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nc8j2"
Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.884115 4774 scope.go:117] "RemoveContainer" containerID="7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b"
Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.884087 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8j2" event={"ID":"fea77ced-5bd1-430a-81c4-494ff6946525","Type":"ContainerDied","Data":"c15104766ad4eb5276f13fe7561352bf070cbb370cee9773ec55b6e473cc6d07"}
Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.921082 4774 scope.go:117] "RemoveContainer" containerID="4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2"
Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.948065 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nc8j2"]
Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.962718 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nc8j2"]
Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.969658 4774 scope.go:117] "RemoveContainer" containerID="8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37"
Jan 27 00:40:18 crc kubenswrapper[4774]: I0127 00:40:18.017641 4774 scope.go:117] "RemoveContainer" containerID="7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b"
Jan 27 00:40:18 crc kubenswrapper[4774]: E0127 00:40:18.018697 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b\": container with ID starting with 7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b not found: ID does not exist" containerID="7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b"
Jan 27 00:40:18 crc kubenswrapper[4774]: I0127 00:40:18.018770 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b"} err="failed to get container status \"7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b\": rpc error: code = NotFound desc = could not find container \"7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b\": container with ID starting with 7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b not found: ID does not exist"
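The RemoveContainer, "ContainerStatus from runtime service failed ... NotFound" and "DeleteContainer returned error" sequence above (and its repeats elsewhere in this log) is the kubelet asking CRI-O about a container that has already been removed: the NotFound answer means there is nothing left to delete, so the logged error is effectively harmless. A small Go sketch of that treat-already-gone-as-done pattern follows; the fakeRuntime type, errNotFound value and removeIdempotent helper are made up for illustration and are not kubelet or CRI-O names.

package main

import (
	"errors"
	"fmt"
)

// errNotFound stands in for the CRI "NotFound" status quoted in the log above.
var errNotFound = errors.New("container not found")

// fakeRuntime is a hypothetical in-memory stand-in for a container runtime client.
type fakeRuntime struct{ containers map[string]bool }

func (r *fakeRuntime) RemoveContainer(id string) error {
	if !r.containers[id] {
		return fmt.Errorf("could not find container %q: %w", id, errNotFound)
	}
	delete(r.containers, id)
	return nil
}

// removeIdempotent deletes a container but treats "already gone" as done,
// which is why a second delete of the same ID is not a real failure.
func removeIdempotent(r *fakeRuntime, id string) error {
	if err := r.RemoveContainer(id); err != nil {
		if errors.Is(err, errNotFound) {
			fmt.Printf("container %s already removed, nothing to do\n", id)
			return nil
		}
		return err
	}
	fmt.Printf("container %s removed\n", id)
	return nil
}

func main() {
	rt := &fakeRuntime{containers: map[string]bool{"7ca2af3e": true}}
	_ = removeIdempotent(rt, "7ca2af3e") // first attempt removes it
	_ = removeIdempotent(rt, "7ca2af3e") // second attempt hits NotFound and is treated as done
}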
Jan 27 00:40:18 crc kubenswrapper[4774]: I0127 00:40:18.018818 4774 scope.go:117] "RemoveContainer" containerID="4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2"
Jan 27 00:40:18 crc kubenswrapper[4774]: E0127 00:40:18.019455 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2\": container with ID starting with 4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2 not found: ID does not exist" containerID="4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2"
Jan 27 00:40:18 crc kubenswrapper[4774]: I0127 00:40:18.019496 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2"} err="failed to get container status \"4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2\": rpc error: code = NotFound desc = could not find container \"4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2\": container with ID starting with 4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2 not found: ID does not exist"
Jan 27 00:40:18 crc kubenswrapper[4774]: I0127 00:40:18.019540 4774 scope.go:117] "RemoveContainer" containerID="8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37"
Jan 27 00:40:18 crc kubenswrapper[4774]: E0127 00:40:18.019917 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37\": container with ID starting with 8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37 not found: ID does not exist" containerID="8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37"
Jan 27 00:40:18 crc kubenswrapper[4774]: I0127 00:40:18.019970 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37"} err="failed to get container status \"8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37\": rpc error: code = NotFound desc = could not find container \"8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37\": container with ID starting with 8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37 not found: ID does not exist"
Jan 27 00:40:18 crc kubenswrapper[4774]: I0127 00:40:18.372921 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fea77ced-5bd1-430a-81c4-494ff6946525" path="/var/lib/kubelet/pods/fea77ced-5bd1-430a-81c4-494ff6946525/volumes"
Jan 27 00:41:06 crc kubenswrapper[4774]: I0127 00:41:06.675769 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 00:41:06 crc kubenswrapper[4774]: I0127 00:41:06.676799 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
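The liveness failure above is an HTTP GET against http://127.0.0.1:8798/health that comes back with "connection refused", meaning nothing is listening on that port at all, as opposed to a slow or unhealthy handler. A minimal Go sketch of the same kind of request is below; the 1-second client timeout is an illustrative choice, since the probe's configured timeoutSeconds is not shown in this log.

package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{Timeout: time.Second}
	// Same endpoint the liveness probe above reports as failing.
	resp, err := client.Get("http://127.0.0.1:8798/health")
	if err != nil {
		// e.g. dial tcp 127.0.0.1:8798: connect: connection refused
		fmt.Println("liveness check failed:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("liveness check status:", resp.Status)
}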
Jan 27 00:41:36 crc kubenswrapper[4774]: I0127 00:41:36.675613 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 00:41:36 crc kubenswrapper[4774]: I0127 00:41:36.676805 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 00:42:06 crc kubenswrapper[4774]: I0127 00:42:06.675643 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 00:42:06 crc kubenswrapper[4774]: I0127 00:42:06.678056 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 00:42:06 crc kubenswrapper[4774]: I0127 00:42:06.678308 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s"
Jan 27 00:42:06 crc kubenswrapper[4774]: I0127 00:42:06.679161 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea1f7f1507e8ab30d775c393877ef9ca6aab35f20f5448926e8f50f33e3e67f9"} pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 27 00:42:06 crc kubenswrapper[4774]: I0127 00:42:06.679312 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" containerID="cri-o://ea1f7f1507e8ab30d775c393877ef9ca6aab35f20f5448926e8f50f33e3e67f9" gracePeriod=600
Jan 27 00:42:07 crc kubenswrapper[4774]: I0127 00:42:07.072414 4774 generic.go:334] "Generic (PLEG): container finished" podID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerID="ea1f7f1507e8ab30d775c393877ef9ca6aab35f20f5448926e8f50f33e3e67f9" exitCode=0
Jan 27 00:42:07 crc kubenswrapper[4774]: I0127 00:42:07.072842 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerDied","Data":"ea1f7f1507e8ab30d775c393877ef9ca6aab35f20f5448926e8f50f33e3e67f9"}
Jan 27 00:42:07 crc kubenswrapper[4774]: I0127 00:42:07.072924 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"9542a14f7f81cbd726fa24778812835add869f0c4df1484cb45f72957fe0636b"}
Jan 27 00:42:07 crc kubenswrapper[4774]: I0127 00:42:07.072955 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541"
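The liveness probe for machine-config-daemon-2nl9s fails at 00:41:06, 00:41:36 and 00:42:06, and only after the third failure does the kubelet log "failed liveness probe, will be restarted" and kill the container with gracePeriod=600. The 30-second spacing and the restart on the third consecutive failure would fit a probe configured with something like periodSeconds=30 and failureThreshold=3, but the actual probe spec is not in this log, so that is an inference. The short Go sketch below just computes the gaps between the logged failure times.

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05"
	// Liveness failure times taken from the entries above.
	stamps := []string{
		"2026-01-27 00:41:06",
		"2026-01-27 00:41:36",
		"2026-01-27 00:42:06", // followed by "will be restarted"
	}
	var failures []time.Time
	for _, s := range stamps {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		failures = append(failures, t)
	}
	for i := 1; i < len(failures); i++ {
		fmt.Printf("gap between failure %d and %d: %s\n", i, i+1, failures[i].Sub(failures[i-1]))
	}
}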
Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.194944 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q86gp"]
Jan 27 00:42:48 crc kubenswrapper[4774]: E0127 00:42:48.197194 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea77ced-5bd1-430a-81c4-494ff6946525" containerName="extract-content"
Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.197226 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea77ced-5bd1-430a-81c4-494ff6946525" containerName="extract-content"
Jan 27 00:42:48 crc kubenswrapper[4774]: E0127 00:42:48.197253 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8e943dd-43b5-4948-b976-abb2c181609b" containerName="registry-server"
Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.197266 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e943dd-43b5-4948-b976-abb2c181609b" containerName="registry-server"
Jan 27 00:42:48 crc kubenswrapper[4774]: E0127 00:42:48.197293 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" containerName="gather"
Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.197307 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" containerName="gather"
Jan 27 00:42:48 crc kubenswrapper[4774]: E0127 00:42:48.197325 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea77ced-5bd1-430a-81c4-494ff6946525" containerName="registry-server"
Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.197338 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea77ced-5bd1-430a-81c4-494ff6946525" containerName="registry-server"
Jan 27 00:42:48 crc kubenswrapper[4774]: E0127 00:42:48.197362 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" containerName="copy"
Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.197373 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" containerName="copy"
Jan 27 00:42:48 crc kubenswrapper[4774]: E0127 00:42:48.197397 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea77ced-5bd1-430a-81c4-494ff6946525" containerName="extract-utilities"
Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.197411 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea77ced-5bd1-430a-81c4-494ff6946525" containerName="extract-utilities"
Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.197687 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" containerName="copy"
Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.197720 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8e943dd-43b5-4948-b976-abb2c181609b" containerName="registry-server"
Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.197740 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" containerName="gather"
Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.197771 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea77ced-5bd1-430a-81c4-494ff6946525" containerName="registry-server"
Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.199724 4774 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.202350 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q86gp"] Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.316696 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvr22\" (UniqueName: \"kubernetes.io/projected/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-kube-api-access-dvr22\") pod \"community-operators-q86gp\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.316781 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-catalog-content\") pod \"community-operators-q86gp\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.316815 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-utilities\") pod \"community-operators-q86gp\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.419841 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvr22\" (UniqueName: \"kubernetes.io/projected/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-kube-api-access-dvr22\") pod \"community-operators-q86gp\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.419989 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-catalog-content\") pod \"community-operators-q86gp\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.420030 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-utilities\") pod \"community-operators-q86gp\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.420612 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-catalog-content\") pod \"community-operators-q86gp\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.421528 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-utilities\") pod \"community-operators-q86gp\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.450683 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dvr22\" (UniqueName: \"kubernetes.io/projected/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-kube-api-access-dvr22\") pod \"community-operators-q86gp\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.534365 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.918942 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q86gp"] Jan 27 00:42:49 crc kubenswrapper[4774]: I0127 00:42:49.552294 4774 generic.go:334] "Generic (PLEG): container finished" podID="f342913b-d5a0-45a5-a4ec-c48a7d9adb39" containerID="23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322" exitCode=0 Jan 27 00:42:49 crc kubenswrapper[4774]: I0127 00:42:49.552417 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q86gp" event={"ID":"f342913b-d5a0-45a5-a4ec-c48a7d9adb39","Type":"ContainerDied","Data":"23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322"} Jan 27 00:42:49 crc kubenswrapper[4774]: I0127 00:42:49.552459 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q86gp" event={"ID":"f342913b-d5a0-45a5-a4ec-c48a7d9adb39","Type":"ContainerStarted","Data":"d8bdf409fa5b9682622fe9503e85e6d67518d56a44cf740ce575fc7a5663caa7"} Jan 27 00:42:51 crc kubenswrapper[4774]: I0127 00:42:51.595167 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q86gp" event={"ID":"f342913b-d5a0-45a5-a4ec-c48a7d9adb39","Type":"ContainerDied","Data":"bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc"} Jan 27 00:42:51 crc kubenswrapper[4774]: I0127 00:42:51.595192 4774 generic.go:334] "Generic (PLEG): container finished" podID="f342913b-d5a0-45a5-a4ec-c48a7d9adb39" containerID="bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc" exitCode=0 Jan 27 00:42:52 crc kubenswrapper[4774]: I0127 00:42:52.614666 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q86gp" event={"ID":"f342913b-d5a0-45a5-a4ec-c48a7d9adb39","Type":"ContainerStarted","Data":"1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f"} Jan 27 00:42:52 crc kubenswrapper[4774]: I0127 00:42:52.659951 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q86gp" podStartSLOduration=2.192490775 podStartE2EDuration="4.659913335s" podCreationTimestamp="2026-01-27 00:42:48 +0000 UTC" firstStartedPulling="2026-01-27 00:42:49.557576743 +0000 UTC m=+2147.863353647" lastFinishedPulling="2026-01-27 00:42:52.024999323 +0000 UTC m=+2150.330776207" observedRunningTime="2026-01-27 00:42:52.64560051 +0000 UTC m=+2150.951377474" watchObservedRunningTime="2026-01-27 00:42:52.659913335 +0000 UTC m=+2150.965690249" Jan 27 00:42:58 crc kubenswrapper[4774]: I0127 00:42:58.535943 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:58 crc kubenswrapper[4774]: I0127 00:42:58.536750 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:58 crc kubenswrapper[4774]: I0127 00:42:58.625968 4774 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:58 crc kubenswrapper[4774]: I0127 00:42:58.743701 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:58 crc kubenswrapper[4774]: I0127 00:42:58.883994 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q86gp"] Jan 27 00:43:00 crc kubenswrapper[4774]: I0127 00:43:00.694277 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q86gp" podUID="f342913b-d5a0-45a5-a4ec-c48a7d9adb39" containerName="registry-server" containerID="cri-o://1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f" gracePeriod=2 Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.233036 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.285541 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-utilities\") pod \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.286837 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-utilities" (OuterVolumeSpecName: "utilities") pod "f342913b-d5a0-45a5-a4ec-c48a7d9adb39" (UID: "f342913b-d5a0-45a5-a4ec-c48a7d9adb39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.293983 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-catalog-content\") pod \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.294113 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvr22\" (UniqueName: \"kubernetes.io/projected/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-kube-api-access-dvr22\") pod \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.294936 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.308260 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-kube-api-access-dvr22" (OuterVolumeSpecName: "kube-api-access-dvr22") pod "f342913b-d5a0-45a5-a4ec-c48a7d9adb39" (UID: "f342913b-d5a0-45a5-a4ec-c48a7d9adb39"). InnerVolumeSpecName "kube-api-access-dvr22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.396504 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvr22\" (UniqueName: \"kubernetes.io/projected/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-kube-api-access-dvr22\") on node \"crc\" DevicePath \"\"" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.414234 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f342913b-d5a0-45a5-a4ec-c48a7d9adb39" (UID: "f342913b-d5a0-45a5-a4ec-c48a7d9adb39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.499287 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.708661 4774 generic.go:334] "Generic (PLEG): container finished" podID="f342913b-d5a0-45a5-a4ec-c48a7d9adb39" containerID="1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f" exitCode=0 Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.708766 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q86gp" event={"ID":"f342913b-d5a0-45a5-a4ec-c48a7d9adb39","Type":"ContainerDied","Data":"1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f"} Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.709212 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q86gp" event={"ID":"f342913b-d5a0-45a5-a4ec-c48a7d9adb39","Type":"ContainerDied","Data":"d8bdf409fa5b9682622fe9503e85e6d67518d56a44cf740ce575fc7a5663caa7"} Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.709250 4774 scope.go:117] "RemoveContainer" containerID="1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.708884 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.760110 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q86gp"] Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.765813 4774 scope.go:117] "RemoveContainer" containerID="bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.767175 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q86gp"] Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.798809 4774 scope.go:117] "RemoveContainer" containerID="23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.833384 4774 scope.go:117] "RemoveContainer" containerID="1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f" Jan 27 00:43:01 crc kubenswrapper[4774]: E0127 00:43:01.834554 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f\": container with ID starting with 1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f not found: ID does not exist" containerID="1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.834630 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f"} err="failed to get container status \"1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f\": rpc error: code = NotFound desc = could not find container \"1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f\": container with ID starting with 1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f not found: ID does not exist" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.834684 4774 scope.go:117] "RemoveContainer" containerID="bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc" Jan 27 00:43:01 crc kubenswrapper[4774]: E0127 00:43:01.835446 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc\": container with ID starting with bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc not found: ID does not exist" containerID="bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.835487 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc"} err="failed to get container status \"bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc\": rpc error: code = NotFound desc = could not find container \"bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc\": container with ID starting with bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc not found: ID does not exist" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.835517 4774 scope.go:117] "RemoveContainer" containerID="23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322" Jan 27 00:43:01 crc kubenswrapper[4774]: E0127 00:43:01.836005 4774 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322\": container with ID starting with 23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322 not found: ID does not exist" containerID="23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.836258 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322"} err="failed to get container status \"23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322\": rpc error: code = NotFound desc = could not find container \"23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322\": container with ID starting with 23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322 not found: ID does not exist" Jan 27 00:43:02 crc kubenswrapper[4774]: I0127 00:43:02.370434 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f342913b-d5a0-45a5-a4ec-c48a7d9adb39" path="/var/lib/kubelet/pods/f342913b-d5a0-45a5-a4ec-c48a7d9adb39/volumes"